In [1]:
from IPython.display import Image
Image("home/nati/Pictures/otto_competition.png")
Out[1]:
The goals of this tutorial notebook are to: a) introduce you to the process and approach for performing Exploratory Data Analysis (EDA); b) have you train various classifiers and explore their results; c) use these trained models to predict the target variable (in this example dataset, the type of a product).
Let's begin by importing some common libraries we discussed in the previous part.
In [2]:
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
import seaborn as sns
from sklearn.model_selection import train_test_split
my_color_map = ['green','aqua','pink','blue','red','black','yellow','teal','orange','grey']
Now let's load the dataset for this tutorial: the Otto dataset.
In [3]:
tr_data = pd.read_csv('../input/train.csv')
te_data = pd.read_csv('../input/test.csv')
print('train shape is: {}\ntest shape is: {}'.format(tr_data.shape, te_data.shape))
train shape is: (61878, 95)
test shape is: (144368, 94)
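Before computing any statistics, it is worth a quick sanity check that the files loaded as expected. A minimal sketch (an optional aside, not part of the original notebook) using standard pandas calls:
# peek at the first rows and confirm there are no missing values to handle before EDA
print(tr_data.head())
print('missing values in train: {}'.format(tr_data.isnull().sum().sum()))
print('missing values in test: {}'.format(te_data.isnull().sum().sum()))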
pandas has lots of great features that can help us gain insights into the data with very little effort. Let's begin by exploring some statistics of the numerical features:
In [4]:
tr_data.describe()
Out[4]:
                 id      feat_1        feat_2        feat_3        feat_4
count  61878.000000 61878.00000  61878.000000  61878.000000  61878.000000
mean   30939.500000     0.38668      0.263066      0.901467      0.779081
std    17862.784315     1.52533      1.252073      2.934818      2.788005
min        1.000000     0.00000      0.000000      0.000000      0.000000
25%    15470.250000     0.00000      0.000000      0.000000      0.000000
50%    30939.500000     0.00000      0.000000      0.000000      0.000000
75%    46408.750000     0.00000      0.000000      0.000000      0.000000
max    61878.000000    61.00000     51.000000     64.000000     70.000000

             feat_5        feat_6        feat_7        feat_8        feat_9  ...
count  61878.000000  61878.000000  61878.000000  61878.000000  61878.000000  ...
mean       0.071043      0.025696      0.193704      0.662433      1.011296  ...
std        0.438902      0.215333      1.030102      2.255770      3.474822  ...
min        0.000000      0.000000      0.000000      0.000000      0.000000  ...
25%        0.000000      0.000000      0.000000      0.000000      0.000000  ...
50%        0.000000      0.000000      0.000000      0.000000      0.000000  ...
75%        0.000000      0.000000      0.000000      1.000000      0.000000  ...
max       19.000000     10.000000     38.000000     76.000000     43.000000  ...

            feat_84       feat_85       feat_86       feat_87       feat_88
count  61878.000000  61878.000000  61878.000000  61878.000000  61878.000000
mean       0.070752      0.532306      1.128576      0.393549      0.874915
std        1.151460      1.900438      2.681554      1.575455      2.115466
min        0.000000      0.000000      0.000000      0.000000      0.000000
25%        0.000000      0.000000      0.000000      0.000000      0.000000
50%        0.000000      0.000000      0.000000      0.000000      0.000000
75%        0.000000      0.000000      1.000000      0.000000      1.000000
max       76.000000     55.000000     65.000000     67.000000     30.000000

            feat_89       feat_90       feat_91       feat_92       feat_93
count  61878.000000  61878.000000  61878.000000  61878.000000  61878.000000
mean       0.457772      0.812421      0.264941      0.380119      0.126135
std        1.527385      4.597804      2.045646      0.982385      1.201720
min        0.000000      0.000000      0.000000      0.000000      0.000000
25%        0.000000      0.000000      0.000000      0.000000      0.000000
50%        0.000000      0.000000      0.000000      0.000000      0.000000
75%        0.000000      0.000000      0.000000      0.000000      0.000000
max       61.000000    130.000000     52.000000     19.000000     87.000000

8 rows × 94 columns
This format is somewhat problematic since:
1) when we scroll sideways we notice that not all columns are presented, so we cannot explore them
2) the data is very wide, so we're not using the screen very efficiently
We can solve the first problem by setting some of pandas' display options; as for the screen usage, we can transpose the resulting dataframe.
In [5]:
# set the number of rows and columns to display
pd.options.display.max_rows = 200
pd.options.display.max_columns = 50
# use a transposed view of the features
tr_data.describe().T
Out[5]:
          count          mean           std  min       25%      50%       75%      max
id       61878.0  30939.500000  17862.784315  1.0  15470.25  30939.5  46408.75  61878.0
feat_1   61878.0      0.386680      1.525330  0.0      0.00      0.0      0.00     61.0
feat_2   61878.0      0.263066      1.252073  0.0      0.00      0.0      0.00     51.0
feat_3   61878.0      0.901467      2.934818  0.0      0.00      0.0      0.00     64.0
feat_4   61878.0      0.779081      2.788005  0.0      0.00      0.0      0.00     70.0
feat_5   61878.0      0.071043      0.438902  0.0      0.00      0.0      0.00     19.0
feat_6   61878.0      0.025696      0.215333  0.0      0.00      0.0      0.00     10.0
feat_7   61878.0      0.193704      1.030102  0.0      0.00      0.0      0.00     38.0
feat_8   61878.0      0.662433      2.255770  0.0      0.00      0.0      1.00     76.0
feat_9   61878.0      1.011296      3.474822  0.0      0.00      0.0      0.00     43.0
feat_10  61878.0      0.263906      1.083340  0.0      0.00      0.0      0.00     30.0
feat_11  61878.0      1.252869      3.042333  0.0      0.00      0.0      1.00     38.0
feat_12  61878.0      0.140874      0.567089  0.0      0.00      0.0      0.00     30.0
feat_13  61878.0      0.480979      2.014697  0.0      0.00      0.0      0.00     72.0
feat_14  61878.0      1.696693      3.163212  0.0      0.00      0.0      2.00     33.0
feat_15  61878.0      1.284398      3.862236  0.0      0.00      0.0      1.00     46.0
feat_16  61878.0      1.413459      2.226163  0.0      0.00      0.0      2.00     37.0
feat_17  61878.0      0.366108      1.477436  0.0      0.00      0.0      0.00     43.0
feat_18  61878.0      0.575423      1.335985  0.0      0.00      0.0      1.00     32.0
feat_19  61878.0      0.551699      4.636145  0.0      0.00      0.0      0.00    121.0
feat_20  61878.0      0.471525      1.438727  0.0      0.00      0.0      0.00     27.0
feat_21  61878.0      0.204014      0.696050  0.0      0.00      0.0      0.00     14.0
feat_22  61878.0      0.729969      1.446220  0.0      0.00      0.0      1.00     22.0
feat_23  61878.0      0.142522      0.782979  0.0      0.00      0.0      0.00     64.0
feat_24  61878.0      2.643880      4.629015  0.0      0.00      1.0      3.00    263.0
feat_25  61878.0      1.534520      2.332994  0.0      0.00      1.0      2.00     30.0
feat_26  61878.0      0.563108      1.710305  0.0      0.00      0.0      0.00     33.0
feat_27  61878.0      0.696613      2.873222  0.0      0.00      0.0      0.00    123.0
feat_28  61878.0      0.238970      0.828112  0.0      0.00      0.0      0.00     22.0
feat_29  61878.0      0.275768      1.901294  0.0      0.00      0.0      0.00     69.0
feat_30  61878.0      0.150312      1.640880  0.0      0.00      0.0      0.00     87.0
feat_31  61878.0      0.148680      0.897354  0.0      0.00      0.0      0.00     59.0
feat_32  61878.0      1.043796      2.416849  0.0      0.00      0.0      1.00    149.0
feat_33  61878.0      0.696516      1.310202  0.0      0.00      0.0      1.00     24.0
feat_34  61878.0      0.946411      3.368622  0.0      0.00      0.0      1.00     84.0
feat_35  61878.0      0.666263      3.197965  0.0      0.00      0.0      0.00    105.0
feat_36  61878.0      0.709089      2.555119  0.0      0.00      0.0      1.00     84.0
feat_37  61878.0      0.263632      0.756934  0.0      0.00      0.0      0.00     22.0
feat_38  61878.0      0.582129      1.602579  0.0      0.00      0.0      1.00     39.0
feat_39  61878.0      0.485585      3.298315  0.0      0.00      0.0      0.00     78.0
feat_40  61878.0      1.653059      3.299798  0.0      0.00      0.0      2.00     41.0
feat_41  61878.0      0.303468      1.085672  0.0      0.00      0.0      0.00     36.0
feat_42  61878.0      0.698019      1.961189  0.0      0.00      0.0      1.00     41.0
feat_43  61878.0      0.451146      1.706013  0.0      0.00      0.0      0.00     42.0
feat_44  61878.0      0.560829      1.346090  0.0      0.00      0.0      1.00     34.0
feat_45  61878.0      0.238130      2.587131  0.0      0.00      0.0      0.00     80.0
feat_46  61878.0      0.641375      2.348359  0.0      0.00      0.0      0.00     41.0
feat_47  61878.0      0.249669      1.446203  0.0      0.00      0.0      0.00     47.0
feat_48  61878.0      1.584893      2.577071  0.0      0.00      1.0      2.00     49.0
feat_49  61878.0      0.348314      1.369380  0.0      0.00      0.0      0.00     81.0
feat_50  61878.0      0.324283      1.720470  0.0      0.00      0.0      0.00     73.0
feat_51  61878.0      0.053298      0.513820  0.0      0.00      0.0      0.00     44.0
feat_52  61878.0      0.213485      1.044788  0.0      0.00      0.0      0.00     48.0
feat_53  61878.0      0.442063      2.006485  0.0      0.00      0.0      0.00     53.0
feat_54  61878.0      2.072465      4.113319  0.0      0.00      0.0      2.00     63.0
feat_55  61878.0      0.323120      0.998743  0.0      0.00      0.0      0.00     27.0
feat_56  61878.0      0.303775      1.925806  0.0      0.00      0.0      0.00     62.0
feat_57  61878.0      0.309108      1.082148  0.0      0.00      0.0      0.00     30.0
feat_58  61878.0      0.697970      3.983722  0.0      0.00      0.0      0.00    117.0
feat_59  61878.0      0.388603      2.577693  0.0      0.00      0.0      0.00     97.0
feat_60  61878.0      1.029930      3.028469  0.0      0.00      0.0      0.00     40.0
feat_61  61878.0      0.239746      1.017553  0.0      0.00      0.0      0.00     38.0
feat_62  61878.0      1.187563      2.666742  0.0      0.00      0.0      1.00     56.0
feat_63  61878.0      0.168590      0.946158  0.0      0.00      0.0      0.00     51.0
feat_64  61878.0      1.256796      3.402080  0.0      0.00      0.0      1.00     73.0
feat_65  61878.0      0.222228      0.783052  0.0      0.00      0.0      0.00     38.0
feat_66  61878.0      0.571706      1.361874  0.0      0.00      0.0      1.00     36.0
feat_67  61878.0      2.897653      4.974322  0.0      0.00      1.0      4.00    104.0
feat_68  61878.0      0.392902      1.761054  0.0      0.00      0.0      0.00    109.0
feat_69  61878.0      0.811128      4.111091  0.0      0.00      0.0      0.00     76.0
feat_70  61878.0      0.892789      1.941368  0.0      0.00      0.0      1.00     46.0
feat_71  61878.0      0.319290      1.162443  0.0      0.00      0.0      0.00     31.0
feat_72  61878.0      0.858722      2.411646  0.0      0.00      0.0      1.00     30.0
feat_73  61878.0      0.591050      5.783233  0.0      0.00      0.0      0.00    352.0
feat_74  61878.0      0.579851      3.757822  0.0      0.00      0.0      0.00    231.0
feat_75  61878.0      0.726817      3.200095  0.0      0.00      0.0      0.00     80.0
feat_76  61878.0      0.748457      2.920038  0.0      0.00      0.0      0.00    102.0
feat_77  61878.0      0.124196      0.906621  0.0      0.00      0.0      0.00     29.0
feat_78  61878.0      0.366415      2.778317  0.0      0.00      0.0      0.00     80.0
feat_79  61878.0      0.300446      1.285569  0.0      0.00      0.0      0.00     25.0
feat_80  61878.0      0.698067      2.245671  0.0      0.00      0.0      0.00     54.0
feat_81  61878.0      0.078461      0.461244  0.0      0.00      0.0      0.00     26.0
feat_82  61878.0      0.187983      0.836269  0.0      0.00      0.0      0.00     24.0
feat_83  61878.0      0.496719      2.434921  0.0      0.00      0.0      0.00     79.0
feat_84  61878.0      0.070752      1.151460  0.0      0.00      0.0      0.00     76.0
feat_85  61878.0      0.532306      1.900438  0.0      0.00      0.0      0.00     55.0
feat_86  61878.0      1.128576      2.681554  0.0      0.00      0.0      1.00     65.0
feat_87  61878.0      0.393549      1.575455  0.0      0.00      0.0      0.00     67.0
feat_88  61878.0      0.874915      2.115466  0.0      0.00      0.0      1.00     30.0
feat_89  61878.0      0.457772      1.527385  0.0      0.00      0.0      0.00     61.0
feat_90  61878.0      0.812421      4.597804  0.0      0.00      0.0      0.00    130.0
feat_91  61878.0      0.264941      2.045646  0.0      0.00      0.0      0.00     52.0
feat_92  61878.0      0.380119      0.982385  0.0      0.00      0.0      0.00     19.0
feat_93  61878.0      0.126135      1.201720  0.0      0.00      0.0      0.00     87.0
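A pattern that jumps out of the transposed summary: the median (50%) is 0 for every feature, and the 75th percentile usually is as well, i.e. these count features are mostly zeros. A quick sketch (not part of the original notebook) that quantifies the sparsity with plain pandas:
# fraction of zero entries per feature column (excluding the id and target columns)
zero_frac = (tr_data.iloc[:, 1:-1] == 0).mean()
print(zero_frac.sort_values().head())  # the densest (least sparse) features
print('overall fraction of zeros: {:.2%}'.format(zero_frac.mean()))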
Another great feature of the pandas package is the simplicity of exploring the value distribution of the target variable and of each feature.
In [6]:
print('the value counts of the target are:')
print(tr_data.iloc[:, -1].value_counts())
tr_data.iloc[:, -1].value_counts().plot(kind='bar')
the value counts of the target are:
Class_2 16122
Class_6 14135
Class_8 8464
Class_3 8004
Class_9 4955
Class_7 2839
Class_5 2739
Class_4 2691
Class_1 1929
Name: target, dtype: int64
Axes(0.125,0.125;0.775x0.775)
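The counts above show a clearly imbalanced target: Class_2 occurs roughly eight times as often as Class_1, which is worth keeping in mind when we train and evaluate classifiers later. As a small sketch (not part of the original notebook), the same distribution can be expressed as proportions, reusing the my_color_map list defined earlier:
# normalize=True converts raw counts into class proportions
class_props = tr_data['target'].value_counts(normalize=True)
print(class_props.round(3))
class_props.plot(kind='bar', color=my_color_map, title='class proportions')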
In [7]:
for feat in tr_data.columns[1:-1]:  # skip the first column (the item id) and the last (the target)
    print('the value counts of feature {} are:'.format(feat))
    print(tr_data[feat].value_counts())
the value counts of feature feat_1 are:
0 51483
1 5906
2 1829
3 981
4 521
5 471
6 207
7 192
8 71
9 55
22 32
11 24
10 15
13 10
15 8
26 6
19 5
24 5
12 5
28 4
14 4
21 4
23 4
17 4
48 3
16 3
27 3
25 3
39 2
31 2
42 2
43 2
30 2
47 2
56 1
20 1
37 1
34 1
29 1
61 1
32 1
40 1
Name: feat_1, dtype: int64
the value counts of feature feat_2 are:
0 55018
1 4012
2 1215
3 549
4 310
5 170
6 155
7 84
10 60
8 53
9 51
12 41
11 38
14 30
13 17
15 15
16 10
18 8
21 7
17 7
20 5
19 5
25 2
23 2
24 2
26 1
27 1
35 1
36 1
39 1
37 1
22 1
38 1
51 1
41 1
30 1
31 1
Name: feat_2, dtype: int64
the value counts of feature feat_3 are:
0 49295
1 5346
2 1674
3 909
4 643
5 529
6 486
8 381
7 377
9 355
10 332
11 274
12 229
13 198
14 160
15 122
16 118
17 87
18 83
19 51
20 48
22 30
21 25
23 24
24 22
25 12
28 11
26 9
27 8
31 6
30 6
44 3
38 3
36 3
29 3
32 2
37 2
34 2
59 1
64 1
61 1
41 1
35 1
40 1
50 1
49 1
42 1
52 1
Name: feat_3, dtype: int64
the value counts of feature feat_4 are:
0 48448
1 5947
2 2297
3 1258
4 949
5 629
6 493
7 353
8 283
9 199
10 183
11 146
12 111
14 72
13 68
16 53
15 45
18 39
20 35
22 30
17 30
19 18
24 17
30 16
32 15
26 14
21 12
23 12
34 11
25 10
27 10
28 8
33 6
36 6
38 4
40 4
46 4
44 4
42 4
29 3
50 3
56 3
35 3
55 2
31 2
48 2
58 2
57 2
54 2
67 2
41 1
60 1
52 1
68 1
51 1
37 1
70 1
47 1
63 1
Name: feat_4, dtype: int64
the value counts of feature feat_5 are:
0 58907
1 2357
2 342
3 110
4 56
7 23
5 21
6 19
10 14
8 13
9 6
11 5
12 3
19 1
13 1
Name: feat_5, dtype: int64
the value counts of feature feat_6 are:
0 60710
1 883
2 201
3 57
5 14
4 10
10 1
8 1
6 1
Name: feat_6, dtype: int64
the value counts of feature feat_7 are:
0 56443
1 3346
2 905
3 417
4 231
5 146
6 87
7 61
8 57
9 35
10 32
11 22
13 16
12 13
14 12
15 12
18 9
20 6
17 5
19 5
16 4
38 3
21 3
31 2
30 1
22 1
26 1
27 1
29 1
32 1
Name: feat_7, dtype: int64
the value counts of feature feat_8 are:
0 45312
1 9332
2 3485
3 1239
4 749
5 379
6 310
7 188
8 152
9 134
10 107
11 60
12 52
13 52
14 51
15 41
16 32
18 29
17 25
19 22
21 11
20 8
32 8
22 6
39 6
26 5
25 5
40 5
42 5
23 5
36 4
29 4
33 4
41 4
24 4
27 4
38 4
43 4
28 3
30 3
35 3
37 3
75 2
34 2
31 2
45 2
56 2
54 2
48 1
46 1
59 1
52 1
76 1
51 1
57 1
Name: feat_8, dtype: int64
the value counts of feature feat_9 are:
0 49836
1 5476
2 1887
3 668
4 399
14 373
13 357
12 355
11 290
15 236
10 225
16 203
5 183
9 172
17 164
18 146
6 120
19 117
8 104
7 100
20 92
21 78
22 55
23 51
24 43
25 34
27 29
26 28
28 14
29 12
30 8
32 5
34 5
31 5
33 2
43 2
35 1
37 1
38 1
41 1
Name: feat_9, dtype: int64
the value counts of feature feat_10 are:
0 54195
1 4531
2 1526
3 612
4 314
5 193
6 129
7 89
8 62
9 55
10 37
11 32
12 24
13 22
14 17
16 8
15 7
22 6
17 5
19 3
26 2
18 2
20 2
24 2
30 2
25 1
Name: feat_10, dtype: int64
the value counts of feature feat_11 are:
0 45043
1 6703
2 1959
7 1155
8 1134
9 972
6 823
3 765
10 705
5 532
4 486
11 451
12 283
13 181
14 141
16 134
15 110
17 92
18 75
19 61
20 31
21 21
22 9
23 4
24 4
25 2
38 1
26 1
Name: feat_11, dtype: int64
the value counts of feature feat_12 are:
0 55342
1 5279
2 879
3 221
4 76
5 23
6 16
7 12
8 8
10 5
9 4
28 3
17 2
19 2
11 1
12 1
13 1
21 1
25 1
30 1
Name: feat_12, dtype: int64
the value counts of feature feat_13 are:
0 50430
1 6816
2 2163
3 706
4 360
5 326
10 258
11 210
6 128
12 84
7 70
13 54
30 36
8 36
9 34
19 31
14 19
20 12
15 11
17 11
16 9
18 9
25 7
21 7
22 7
23 4
24 4
40 3
33 3
31 3
26 3
37 2
47 2
60 2
50 2
68 1
29 1
28 1
27 1
35 1
36 1
44 1
55 1
41 1
51 1
38 1
39 1
49 1
48 1
72 1
45 1
Name: feat_13, dtype: int64
the value counts of feature feat_14 are:
0 34542
1 9694
2 4693
3 2783
4 2165
5 1634
6 1363
7 1077
8 832
9 648
10 527
11 443
12 294
13 285
14 213
15 168
16 121
17 104
18 97
19 59
20 33
21 31
22 18
23 14
24 12
25 10
26 7
30 5
28 3
33 1
27 1
32 1
Name: feat_14, dtype: int64
the value counts of feature feat_15 are:
0 43770
1 8056
2 3380
3 1916
4 878
5 420
6 332
7 249
18 193
19 191
14 190
21 188
20 186
8 186
17 180
15 174
16 164
22 152
23 152
9 128
13 127
24 127
12 109
10 99
11 95
25 89
26 47
28 36
27 29
29 16
31 7
30 4
32 3
34 2
46 1
33 1
36 1
Name: feat_15, dtype: int64
the value counts of feature feat_16 are:
0 31649
1 10999
2 6652
3 4201
4 2775
5 1797
6 1225
7 808
8 585
9 422
10 285
11 175
12 113
13 85
14 38
15 23
16 11
17 10
18 7
19 5
21 4
20 2
23 2
26 1
37 1
22 1
25 1
27 1
Name: feat_16, dtype: int64
the value counts of feature feat_17 are:
0 51748
1 6086
2 1881
3 762
4 451
5 249
6 160
7 103
8 99
9 77
10 53
11 47
12 24
14 16
19 13
13 13
15 12
18 9
17 7
26 6
31 5
30 5
22 5
27 5
16 5
33 4
29 4
23 3
21 3
25 3
35 3
20 3
24 3
37 2
34 2
28 2
41 2
32 1
36 1
43 1
Name: feat_17, dtype: int64
the value counts of feature feat_18 are:
0 44037
1 10021
2 3876
3 1766
4 935
5 442
6 307
7 166
8 99
10 80
9 45
12 21
11 21
14 13
18 12
16 8
13 6
15 4
20 4
21 3
19 2
27 2
29 1
17 1
22 1
23 1
24 1
25 1
32 1
30 1
Name: feat_18, dtype: int64
the value counts of feature feat_19 are:
0 56122
1 3375
2 841
3 331
4 187
5 126
6 75
7 52
8 43
10 42
9 31
24 23
11 22
20 22
14 21
15 21
30 21
23 19
35 19
27 19
26 18
29 18
12 17
17 17
28 16
21 15
34 14
31 13
19 13
46 13
25 12
16 12
32 12
13 12
22 12
33 11
18 9
37 9
59 9
69 7
39 7
38 7
75 7
36 7
65 7
43 6
48 6
68 6
53 6
51 6
86 6
66 6
73 5
74 5
54 5
40 5
61 5
70 5
41 5
60 4
71 4
76 4
85 4
67 4
97 4
64 4
98 3
93 3
44 3
82 3
81 3
89 3
45 3
52 3
114 3
63 3
56 2
87 2
55 2
84 2
90 2
50 2
95 2
42 2
80 1
49 1
57 1
96 1
88 1
104 1
58 1
72 1
94 1
121 1
116 1
77 1
110 1
92 1
83 1
99 1
102 1
91 1
47 1
79 1
105 1
Name: feat_19, dtype: int64
the value counts of feature feat_20 are:
0 49044
1 7218
2 2480
3 1092
4 615
5 401
6 262
7 189
8 138
9 104
10 84
11 58
12 46
13 39
15 28
14 27
17 16
16 11
18 6
19 6
20 3
22 3
24 3
23 2
26 1
21 1
27 1
Name: feat_20, dtype: int64
the value counts of feature feat_21 are:
0 54544
1 4384
2 1740
3 647
4 287
5 131
6 84
7 26
8 16
10 6
9 6
13 3
12 2
14 1
11 1
Name: feat_21, dtype: int64
the value counts of feature feat_22 are:
0 40873
1 10202
2 5125
3 2584
4 1427
5 661
6 375
7 225
8 150
9 69
13 49
10 48
11 30
12 25
15 10
14 8
16 7
18 7
20 1
17 1
22 1
Name: feat_22, dtype: int64
the value counts of feature feat_23 are:
0 57470
1 2556
2 901
3 434
4 214
5 101
6 68
7 39
8 33
9 17
10 12
14 5
15 5
11 4
18 4
12 3
13 3
16 2
17 2
21 1
19 1
20 1
64 1
54 1
Name: feat_23, dtype: int64
the value counts of feature feat_24 are:
0 22077
1 11905
2 8694
3 4867
4 3660
5 2242
6 1747
7 1282
8 1056
9 819
10 593
11 483
12 354
13 286
14 284
16 173
15 166
19 123
18 116
17 115
20 98
21 97
24 67
22 62
23 59
25 54
27 47
26 40
28 35
29 27
30 26
35 25
32 24
31 20
34 18
36 18
33 16
44 10
42 8
39 8
37 8
40 7
38 7
43 6
46 6
47 6
56 5
41 4
45 4
51 4
53 3
49 2
50 2
59 1
263 1
80 1
137 1
57 1
52 1
62 1
109 1
94 1
158 1
64 1
48 1
63 1
Name: feat_24, dtype: int64
the value counts of feature feat_25 are:
0 27295
1 14799
2 7088
3 4075
4 2865
5 1846
6 1119
7 789
8 554
9 432
10 312
11 189
12 157
13 111
14 79
15 48
16 44
17 24
18 23
19 9
20 6
21 5
23 3
30 2
28 1
22 1
26 1
27 1
Name: feat_25, dtype: int64
the value counts of feature feat_26 are:
0 49180
1 5871
2 2553
3 1367
4 891
5 554
6 368
7 256
8 209
9 141
10 117
11 91
12 59
13 56
14 38
15 28
16 22
17 21
18 11
19 8
20 8
24 5
25 5
21 4
23 4
22 3
26 3
30 2
28 1
33 1
27 1
Name: feat_26, dtype: int64
the value counts of feature feat_27 are:
0 52827
1 2883
2 1721
3 932
4 650
5 453
6 379
7 318
8 276
9 210
10 201
11 151
12 141
13 102
14 92
16 70
15 52
17 49
18 43
20 41
19 32
21 28
22 26
23 22
25 17
24 17
28 15
31 12
27 11
26 11
32 10
33 8
30 8
29 8
36 7
34 6
35 6
47 5
39 5
37 5
43 4
40 3
44 3
38 3
50 2
41 2
58 2
42 2
108 1
123 1
49 1
48 1
45 1
53 1
52 1
Name: feat_27, dtype: int64
the value counts of feature feat_28 are:
0 54009
1 4555
2 1701
3 752
4 409
5 187
6 119
7 52
8 36
9 24
10 15
11 4
14 4
13 3
12 2
20 1
15 1
16 1
17 1
18 1
22 1
Name: feat_28, dtype: int64
the value counts of feature feat_29 are:
0 54521
1 5169
2 1085
3 405
4 163
5 78
6 76
8 54
9 42
7 37
10 28
11 22
13 22
28 20
27 16
16 15
12 13
29 13
18 10
26 10
14 9
30 7
15 6
25 6
58 4
32 4
64 4
57 3
17 3
22 3
65 3
33 3
47 3
34 2
66 2
31 2
67 2
35 1
62 1
49 1
68 1
69 1
38 1
23 1
54 1
21 1
43 1
44 1
50 1
63 1
Name: feat_29, dtype: int64
the value counts of feature feat_30 are:
0 56951
1 3925
2 504
3 291
4 55
5 35
6 28
7 15
11 7
80 4
8 4
9 3
12 3
14 3
63 3
13 2
53 2
60 2
59 2
58 2
10 2
47 2
48 2
49 2
20 2
19 2
17 1
32 1
64 1
62 1
30 1
34 1
61 1
29 1
51 1
26 1
57 1
82 1
42 1
56 1
87 1
84 1
45 1
77 1
23 1
46 1
15 1
81 1
79 1
31 1
21 1
Name: feat_30, dtype: int64
the value counts of feature feat_31 are:
0 57339
1 2880
2 855
3 335
4 165
5 60
6 58
8 33
10 24
7 23
12 21
9 13
14 12
16 12
11 10
18 6
13 6
20 6
15 4
22 3
17 2
19 2
24 2
27 2
40 1
59 1
26 1
32 1
30 1
Name: feat_31, dtype: int64
the value counts of feature feat_32 are:
0 41962
1 7521
2 4123
3 2049
7 1318
4 1235
6 1108
5 840
8 584
9 367
10 307
11 168
12 142
13 50
14 25
15 24
16 13
17 6
19 5
18 4
29 3
34 2
44 2
28 2
20 2
21 2
30 1
35 1
65 1
91 1
59 1
56 1
36 1
149 1
41 1
53 1
52 1
71 1
76 1
31 1
Name: feat_32, dtype: int64
the value counts of feature feat_33 are:
0 39783
1 12020
2 5201
3 2300
4 1142
5 596
6 339
7 201
8 115
9 79
10 50
11 30
13 7
12 5
15 3
16 3
23 1
17 1
19 1
24 1
Name: feat_33, dtype: int64
the value counts of feature feat_34 are:
0 46172
1 9031
2 2432
3 752
4 403
5 295
6 273
8 221
9 217
7 216
10 212
11 184
12 174
13 166
14 136
15 110
16 96
17 88
18 76
19 72
20 67
22 59
25 46
21 45
24 43
26 40
23 40
28 28
27 26
30 20
32 18
29 17
31 14
38 13
33 12
35 9
34 9
39 8
37 7
42 5
36 5
40 5
47 4
41 3
44 3
43 2
48 2
84 1
46 1
Name: feat_34, dtype: int64
the value counts of feature feat_35 are:
0 48050
1 7526
2 2729
3 1264
4 667
5 415
6 257
7 189
8 119
9 89
10 69
12 36
11 30
13 30
14 28
26 26
16 25
18 22
22 20
28 19
20 19
32 18
15 16
24 15
36 14
30 13
34 11
17 11
19 8
42 7
40 7
56 7
52 7
38 7
55 6
50 6
60 6
44 6
23 5
54 5
58 5
48 4
21 4
39 3
33 3
31 3
25 3
46 3
62 3
47 3
53 3
70 2
61 2
64 2
29 2
66 2
45 2
43 2
73 2
80 2
41 2
82 1
27 1
105 1
96 1
92 1
65 1
59 1
72 1
81 1
90 1
89 1
57 1
68 1
100 1
86 1
49 1
78 1
Name: feat_35, dtype: int64
the value counts of feature feat_36 are:
0 46355
1 8041
2 3554
3 1243
4 715
5 380
6 290
7 190
8 179
9 142
10 106
11 78
12 66
14 61
16 54
13 48
15 42
20 32
17 32
18 30
22 28
19 27
24 22
23 17
21 14
25 12
27 11
32 10
31 10
30 8
28 8
29 7
26 7
44 5
34 5
49 4
43 4
35 4
37 4
33 3
53 3
56 2
36 2
57 2
54 2
47 2
41 2
42 2
50 2
45 2
46 2
38 1
40 1
84 1
52 1
60 1
48 1
61 1
Name: feat_36, dtype: int64
the value counts of feature feat_37 are:
0 51480
1 7068
2 2003
3 723
4 330
5 129
6 71
7 30
8 19
9 5
10 5
11 5
14 4
22 2
20 1
12 1
17 1
18 1
Name: feat_37, dtype: int64
the value counts of feature feat_38 are:
0 45783
1 8876
2 3279
3 1552
4 836
5 433
6 331
7 185
8 148
9 86
12 74
10 73
11 57
14 31
13 25
15 20
17 16
16 14
18 12
19 9
20 6
21 5
28 4
22 3
30 3
38 2
23 2
24 2
26 2
27 2
29 2
36 1
25 1
35 1
34 1
39 1
Name: feat_38, dtype: int64
the value counts of feature feat_39 are:
0 53604
1 4804
2 1580
3 575
4 327
5 189
6 97
8 67
7 59
10 33
11 30
9 26
14 23
13 23
12 21
15 20
33 17
16 17
32 15
30 15
35 15
18 15
26 14
17 13
21 13
25 12
22 11
20 11
24 10
36 10
31 10
19 9
38 9
37 9
23 9
27 9
34 8
28 8
56 8
29 6
52 6
48 6
58 6
49 6
41 5
53 5
51 5
50 5
40 5
47 5
43 5
44 4
39 4
59 4
62 4
46 4
67 4
60 4
66 3
61 3
71 3
54 3
68 3
65 3
74 2
69 2
70 2
42 2
75 2
77 2
78 2
55 2
63 2
72 1
57 1
76 1
64 1
Name: feat_39, dtype: int64
the value counts of feature feat_40 are:
0 32593
1 11485
2 5819
3 3553
4 2157
5 1442
6 962
7 687
8 477
9 393
10 340
11 277
12 245
13 227
14 189
15 184
16 151
17 136
18 105
19 97
20 81
21 55
22 42
23 38
26 25
25 23
24 22
27 17
28 13
29 11
32 7
31 5
30 5
34 4
33 3
35 2
36 2
39 1
37 1
38 1
41 1
Name: feat_40, dtype: int64
the value counts of feature feat_41 are:
0 50713
1 7907
2 1922
3 688
4 235
5 107
6 74
8 37
7 29
10 25
9 19
17 14
14 12
23 11
13 11
15 11
11 10
20 9
12 9
21 8
16 7
19 4
18 3
26 3
22 2
25 2
27 2
29 1
36 1
28 1
30 1
Name: feat_41, dtype: int64
the value counts of feature feat_42 are:
0 44327
1 9787
2 3295
3 1444
4 842
5 531
6 392
7 277
8 195
9 173
10 129
11 81
12 71
13 66
14 50
16 31
15 29
18 22
17 21
21 20
19 13
20 12
23 11
22 10
24 9
25 7
26 6
29 6
34 4
28 4
27 3
37 2
35 2
41 1
30 1
36 1
33 1
32 1
31 1
Name: feat_42, dtype: int64
the value counts of feature feat_43 are:
0 51611
1 5780
2 1815
3 727
4 398
5 257
6 210
7 147
8 140
10 131
9 124
11 114
12 87
13 84
14 69
16 42
15 40
17 29
18 17
19 17
21 14
20 11
23 5
22 3
24 3
26 1
42 1
28 1
Name: feat_43, dtype: int64
the value counts of feature feat_44 are:
0 45082
1 8919
2 3986
3 1749
4 909
5 431
6 283
7 135
8 112
9 67
10 53
11 43
12 32
13 18
14 17
17 11
16 10
18 4
19 4
15 3
20 3
23 2
21 2
34 1
22 1
29 1
Name: feat_44, dtype: int64
the value counts of feature feat_45 are:
0 58021
1 2628
2 543
3 201
4 71
5 42
6 41
9 23
8 19
17 16
7 15
18 15
10 15
11 11
34 9
19 9
20 9
12 8
31 7
62 7
24 7
53 7
42 6
58 6
41 6
14 5
36 5
45 5
25 5
29 5
21 5
57 5
40 4
28 4
38 4
37 4
47 4
16 4
64 4
22 4
35 3
23 3
69 3
26 3
60 3
56 3
49 3
63 3
43 3
15 3
51 3
13 3
52 2
30 2
33 2
61 2
48 2
66 2
73 2
54 2
27 2
76 1
44 1
68 1
59 1
46 1
67 1
78 1
39 1
71 1
75 1
50 1
72 1
55 1
80 1
32 1
Name: feat_45, dtype: int64
the value counts of feature feat_46 are:
0 51921
1 4286
2 1429
3 772
4 613
5 468
6 408
7 333
8 267
9 240
10 209
12 162
11 152
13 126
14 99
15 67
16 60
17 49
18 44
19 36
20 24
21 20
22 19
24 15
23 12
25 11
27 8
40 4
26 4
32 4
34 3
29 3
35 2
28 2
33 2
41 1
30 1
36 1
31 1
Name: feat_46, dtype: int64
the value counts of feature feat_47 are:
0 56688
1 2820
2 822
3 400
4 248
5 212
6 134
7 106
8 64
10 60
9 57
11 42
14 28
13 27
15 25
12 25
16 17
17 15
18 14
19 9
20 9
21 7
24 6
23 6
22 5
25 4
28 4
37 3
36 3
29 3
40 2
26 2
27 2
34 2
32 2
31 2
30 1
38 1
47 1
Name: feat_47, dtype: int64
the value counts of feature feat_48 are:
0 28174
1 12801
2 7397
3 4420
4 2876
5 2066
6 1390
7 890
8 603
9 397
10 270
11 138
12 79
14 73
13 66
15 55
16 41
17 16
20 15
21 12
23 10
18 10
33 7
19 6
41 5
39 5
36 5
38 5
22 5
40 4
34 4
48 3
27 3
25 3
32 3
24 3
35 2
31 2
42 2
46 2
26 2
28 2
43 2
37 1
29 1
47 1
49 1
Name: feat_48, dtype: int64
the value counts of feature feat_49 are:
0 51571
1 6343
2 1925
3 744
4 420
5 234
6 188
8 102
7 99
9 49
10 39
12 30
11 26
13 19
14 15
16 13
15 13
22 8
17 8
18 6
19 4
46 3
26 3
32 2
34 2
23 2
47 2
35 1
20 1
81 1
24 1
33 1
25 1
72 1
30 1
Name: feat_49, dtype: int64
the value counts of feature feat_50 are:
0 53492
1 4974
2 1575
3 686
4 354
5 229
6 143
7 76
8 60
9 53
10 35
13 22
11 19
12 18
15 11
20 10
14 10
16 8
17 7
19 6
18 6
24 6
22 6
33 5
25 4
37 4
27 4
26 4
47 4
28 4
23 4
44 3
30 3
39 3
38 3
32 3
35 2
31 2
43 2
21 2
55 1
50 1
49 1
48 1
65 1
45 1
34 1
71 1
57 1
36 1
68 1
53 1
73 1
41 1
54 1
40 1
Name: feat_50, dtype: int64
the value counts of feature feat_51 are:
0 60159
1 1088
2 346
3 107
4 76
5 36
6 17
8 13
7 7
11 6
10 5
9 3
17 3
16 2
18 2
28 2
24 1
12 1
44 1
13 1
14 1
33 1
Name: feat_51, dtype: int64
the value counts of feature feat_52 are:
0 55740
1 3527
2 1241
3 558
4 297
5 162
6 101
7 57
8 49
9 30
12 19
11 17
10 15
13 8
14 8
20 6
17 6
15 5
18 5
16 5
19 4
22 3
26 2
32 1
23 1
25 1
24 1
35 1
39 1
36 1
21 1
38 1
48 1
27 1
44 1
28 1
Name: feat_52, dtype: int64
the value counts of feature feat_53 are:
0 49735
1 8346
2 2073
3 536
4 256
5 177
6 116
7 106
8 71
9 54
21 31
19 25
27 23
24 23
18 23
30 21
22 19
23 19
10 19
11 17
20 17
15 17
26 17
12 16
14 15
28 14
17 14
29 11
13 10
31 10
25 8
16 7
35 6
33 5
32 5
38 4
34 4
37 3
36 2
39 1
40 1
53 1
Name: feat_53, dtype: int64
the value counts of feature feat_54 are:
0 33571
1 8734
2 4912
3 3258
4 2207
5 1754
6 1354
7 1049
8 874
9 691
10 548
11 416
12 383
14 347
13 335
15 241
16 201
18 158
17 143
19 112
20 92
21 75
22 71
23 45
24 39
25 33
26 29
28 29
30 25
27 22
29 17
32 14
34 11
33 9
31 9
40 8
35 8
42 7
43 7
36 7
37 6
41 4
44 4
48 4
63 3
55 2
38 2
39 2
45 2
60 1
54 1
49 1
52 1
Name: feat_54, dtype: int64
the value counts of feature feat_55 are:
0 50509
1 7423
2 2222
3 793
4 325
5 200
6 133
7 94
8 68
12 20
9 18
10 16
11 15
13 11
15 4
18 4
19 4
14 3
16 3
17 3
20 2
21 2
22 2
27 2
26 1
23 1
Name: feat_55, dtype: int64
the value counts of feature feat_56 are:
0 54672
1 4848
2 1263
3 301
4 118
5 70
7 63
6 45
12 39
9 39
11 37
15 35
13 34
10 32
8 31
14 23
16 23
18 22
19 18
17 17
20 16
22 16
24 10
29 9
26 8
33 8
23 7
25 7
31 6
27 6
30 6
21 6
44 5
28 4
32 4
52 3
42 3
48 3
35 2
49 2
53 2
56 2
36 2
60 1
34 1
45 1
38 1
57 1
55 1
62 1
39 1
51 1
43 1
46 1
Name: feat_56, dtype: int64
the value counts of feature feat_57 are:
0 52655
1 5213
2 1892
3 865
4 486
5 274
6 156
7 99
8 83
9 50
10 31
11 18
12 9
14 8
13 5
15 5
18 5
19 4
20 4
24 4
16 3
28 2
22 2
25 2
17 1
23 1
30 1
Name: feat_57, dtype: int64
the value counts of feature feat_58 are:
0 53542
1 3957
2 1196
3 553
4 463
5 316
6 273
7 201
8 163
10 138
9 122
11 92
13 65
12 61
14 53
20 47
15 38
22 35
16 35
17 34
24 29
21 26
23 26
19 26
27 24
18 24
26 22
31 20
33 19
25 19
32 16
30 14
28 12
36 12
35 11
37 9
29 9
45 9
41 8
43 8
34 8
50 8
40 7
51 7
56 7
47 6
46 6
57 6
42 6
60 5
39 5
65 5
64 4
55 4
44 4
48 4
53 4
49 3
70 3
67 3
52 3
38 3
93 3
73 3
59 3
62 2
116 2
84 2
61 2
86 2
83 2
63 2
69 2
71 2
115 1
74 1
99 1
82 1
92 1
66 1
81 1
80 1
117 1
94 1
54 1
58 1
75 1
Name: feat_58, dtype: int64
the value counts of feature feat_59 are:
0 55082
1 4148
2 983
3 435
4 252
5 174
6 108
7 58
8 56
19 38
9 36
12 34
11 30
10 29
14 27
18 26
17 26
13 25
16 23
15 19
29 17
23 15
22 15
32 14
36 13
28 13
26 13
34 12
24 12
21 11
30 11
25 10
20 10
40 8
33 8
38 8
37 7
50 7
35 7
43 6
39 6
27 6
48 4
31 4
46 3
42 3
47 3
61 2
45 2
64 2
53 2
51 2
59 2
44 2
55 1
60 1
49 1
97 1
65 1
57 1
41 1
56 1
63 1
Name: feat_59, dtype: int64
the value counts of feature feat_60 are:
0 46458
1 7982
2 2024
12 1040
3 617
4 560
8 557
13 347
10 346
9 285
6 248
5 227
14 204
16 198
11 192
7 155
17 106
15 86
18 80
19 42
21 29
20 27
22 16
23 9
24 7
25 5
28 5
26 4
27 4
32 4
31 3
39 2
30 2
33 2
40 1
38 1
37 1
36 1
34 1
Name: feat_60, dtype: int64
the value counts of feature feat_61 are:
0 56075
1 2504
2 1281
3 742
4 457
5 274
6 192
7 114
8 90
9 48
10 40
11 22
12 11
13 7
14 5
16 4
17 4
15 3
19 1
38 1
33 1
18 1
23 1
Name: feat_61, dtype: int64
the value counts of feature feat_62 are:
0 35383
1 11923
2 5838
3 3104
4 1856
5 1050
6 667
7 458
8 348
9 238
10 172
11 154
12 119
13 96
14 77
16 76
15 73
17 47
18 38
19 30
54 28
20 22
21 18
23 10
25 9
22 8
24 7
55 5
29 5
56 3
26 3
28 3
41 2
27 2
46 1
42 1
34 1
33 1
32 1
31 1
Name: feat_62, dtype: int64
the value counts of feature feat_63 are:
0 55853
1 4473
2 912
3 254
4 111
6 55
5 49
17 43
7 25
10 24
11 22
9 12
8 9
15 7
12 4
21 3
16 3
19 3
20 2
22 2
18 2
50 2
51 2
23 2
30 1
27 1
26 1
14 1
Name: feat_63, dtype: int64
the value counts of feature feat_64 are:
0 40077
1 11353
2 3696
3 1461
4 818
5 555
6 434
8 397
7 365
10 335
9 328
11 295
13 235
12 222
14 192
15 172
16 170
18 123
17 118
19 92
20 76
23 57
22 54
21 53
25 33
24 33
26 20
27 19
29 14
28 13
30 11
32 9
31 6
35 5
34 5
33 5
43 4
39 3
38 3
36 3
40 2
37 2
41 2
42 2
44 2
55 1
51 1
73 1
72 1
Name: feat_64, dtype: int64
the value counts of feature feat_65 are:
0 52706
1 6639
2 1678
3 485
4 171
5 77
6 49
7 20
8 13
14 7
10 7
9 5
25 3
38 3
12 3
30 2
16 2
28 1
13 1
19 1
20 1
23 1
24 1
26 1
11 1
Name: feat_65, dtype: int64
the value counts of feature feat_66 are:
0 44352
1 9523
2 4165
3 1757
4 865
5 419
6 294
7 156
8 115
9 58
11 53
10 45
12 18
14 9
13 7
19 4
20 4
22 4
16 4
15 3
26 3
21 3
28 2
17 2
18 2
24 2
27 2
33 1
23 1
35 1
25 1
36 1
29 1
30 1
Name: feat_66, dtype: int64
the value counts of feature feat_67 are:
0 23930
1 9878
2 6689
3 4434
4 4116
5 2516
6 2353
7 1649
8 1358
9 964
10 749
11 498
12 394
13 336
14 253
15 202
16 164
17 122
18 115
22 96
19 95
20 94
24 78
21 72
23 70
25 70
28 55
27 53
26 49
29 42
30 37
33 29
36 28
31 27
32 26
35 25
34 24
38 21
37 18
39 17
45 16
40 13
41 12
42 12
48 8
46 7
44 7
49 6
53 6
47 4
52 4
56 3
51 3
54 3
43 3
61 2
60 2
64 2
68 2
55 2
59 2
58 2
50 2
70 1
104 1
89 1
76 1
79 1
62 1
82 1
83 1
63 1
Name: feat_67, dtype: int64
the value counts of feature feat_68 are:
0 53144
1 4930
2 1442
3 638
4 470
5 241
6 157
7 134
9 97
11 86
8 83
10 80
12 79
13 47
16 44
14 43
15 33
17 30
20 14
18 14
19 12
26 10
22 10
23 7
27 7
21 5
25 4
33 3
24 2
35 2
37 2
64 1
32 1
48 1
34 1
36 1
28 1
109 1
29 1
Name: feat_68, dtype: int64
the value counts of feature feat_69 are:
0 53174
1 4262
2 1260
3 563
4 382
5 209
6 157
7 132
8 119
9 107
10 103
12 98
11 91
15 71
14 69
13 67
16 67
22 62
20 53
18 51
17 50
19 49
34 39
24 39
21 38
25 38
28 33
27 32
23 32
32 31
26 31
30 27
41 26
31 25
33 24
40 23
29 23
36 23
37 20
35 18
39 18
45 15
43 14
38 12
48 12
46 11
53 11
42 9
47 9
51 8
44 8
50 5
49 5
55 4
56 3
60 3
59 2
57 2
52 2
62 2
61 1
76 1
64 1
54 1
63 1
Name: feat_69, dtype: int64
the value counts of feature feat_70 are:
0 40241
1 9865
2 4735
3 2589
4 1568
5 962
6 600
7 346
8 250
9 164
10 128
11 109
12 75
14 46
13 40
15 33
16 32
17 23
19 16
18 13
20 10
25 7
24 5
23 4
26 3
21 3
30 2
31 2
22 1
37 1
41 1
46 1
27 1
28 1
32 1
Name: feat_70, dtype: int64
the value counts of feature feat_71 are:
0 51493
1 6698
2 1906
3 783
4 373
5 203
6 103
7 72
8 48
9 33
10 31
13 21
11 14
12 13
17 10
18 10
19 10
21 9
15 8
16 7
14 6
23 6
20 5
25 5
24 4
26 4
28 2
31 1
Name: feat_71, dtype: int64
the value counts of feature feat_72 are:
0 43995
1 9548
2 3123
3 1274
4 815
5 517
6 421
7 364
8 317
9 245
10 235
11 175
12 134
13 128
14 97
16 96
15 89
17 60
18 52
19 51
20 42
21 29
22 23
23 13
24 10
25 8
27 6
28 5
26 4
29 1
30 1
Name: feat_72, dtype: int64
the value counts of feature feat_73 are:
0 51527
1 5693
2 2742
3 597
4 355
5 200
6 128
7 93
8 59
9 44
10 37
11 30
12 25
14 19
15 17
13 15
17 13
22 13
16 11
18 10
20 10
37 9
26 8
19 8
35 7
21 6
31 6
29 6
23 5
40 5
56 5
25 5
27 5
38 5
32 5
46 4
41 4
39 4
63 4
34 4
33 4
50 4
47 4
24 4
30 4
52 4
55 4
89 4
28 4
85 3
48 3
90 3
59 3
44 3
42 3
101 3
36 3
113 3
45 3
58 3
51 2
281 2
91 2
60 2
92 2
352 2
75 2
121 2
115 2
82 2
112 2
76 1
98 1
132 1
68 1
49 1
323 1
131 1
252 1
61 1
253 1
93 1
161 1
65 1
62 1
114 1
160 1
96 1
64 1
158 1
148 1
69 1
311 1
83 1
171 1
139 1
108 1
119 1
170 1
77 1
88 1
137 1
86 1
72 1
80 1
78 1
110 1
71 1
54 1
181 1
111 1
283 1
325 1
165 1
287 1
Name: feat_73, dtype: int64
the value counts of feature feat_74 are:
0 50047
1 6103
2 2778
3 1194
4 595
5 240
6 190
7 140
8 106
9 71
10 57
11 39
12 28
13 19
14 14
16 13
19 12
15 10
17 9
18 9
20 8
25 8
22 8
23 6
62 6
37 5
24 5
34 5
44 5
107 5
47 5
27 5
40 4
41 4
39 4
29 4
33 4
32 4
43 4
28 4
26 4
31 4
45 3
59 3
49 3
81 3
72 3
67 3
21 3
53 3
58 3
52 3
42 2
70 2
38 2
101 2
69 2
36 2
63 2
54 2
92 2
55 2
56 2
30 2
48 2
60 2
87 1
89 1
91 1
35 1
76 1
66 1
180 1
93 1
161 1
125 1
96 1
64 1
94 1
61 1
115 1
139 1
83 1
172 1
77 1
46 1
74 1
79 1
105 1
73 1
80 1
112 1
113 1
145 1
50 1
231 1
103 1
82 1
51 1
102 1
110 1
Name: feat_74, dtype: int64
the value counts of feature feat_75 are:
0 48568
1 8078
2 2022
3 651
4 448
5 260
6 243
7 227
8 170
9 161
10 90
11 89
18 62
12 59
14 55
15 51
17 51
23 47
21 47
16 44
20 43
13 41
24 39
22 39
19 36
26 31
27 30
25 26
29 21
32 11
42 9
36 8
37 8
30 7
33 7
35 7
39 7
41 6
28 6
31 6
44 6
46 6
45 5
34 5
43 4
49 4
53 4
48 3
47 3
52 2
38 2
77 2
59 2
62 2
54 2
71 1
66 1
74 1
64 1
51 1
80 1
61 1
73 1
58 1
76 1
60 1
72 1
40 1
75 1
50 1
Name: feat_75, dtype: int64
the value counts of feature feat_76 are:
0 48487
1 7080
2 2231
3 919
4 583
5 446
6 307
7 279
8 225
9 198
10 156
11 126
12 106
13 101
14 77
16 64
15 63
19 40
18 39
17 39
20 34
21 24
22 24
23 20
24 19
27 17
25 15
29 14
30 13
35 11
32 10
26 9
39 9
34 8
28 8
38 8
31 8
42 6
37 6
36 5
41 5
33 5
44 4
43 3
40 3
45 3
51 2
50 2
65 2
46 2
52 2
57 2
47 1
48 1
59 1
80 1
49 1
102 1
54 1
56 1
55 1
Name: feat_76, dtype: int64
the value counts of feature feat_77 are:
0 58354
1 2280
2 515
4 341
3 125
5 102
6 46
9 36
27 12
11 11
23 10
8 9
22 6
26 6
24 5
25 5
10 4
18 2
28 2
19 2
7 2
12 1
21 1
29 1
Name: feat_77, dtype: int64
the value counts of feature feat_78 are:
0 55574
1 3794
2 1106
3 396
4 246
5 123
6 84
7 51
9 36
8 28
10 27
12 25
14 23
13 19
20 18
25 15
11 15
16 14
15 13
31 12
33 11
22 11
35 11
19 10
34 10
45 9
47 9
36 9
18 9
24 9
43 8
27 8
37 7
28 7
38 7
39 7
41 7
21 7
23 7
17 7
46 6
30 6
29 6
42 5
49 5
26 5
55 4
50 4
44 4
53 4
40 4
58 4
57 4
54 3
66 3
51 2
32 2
48 2
64 2
67 2
71 2
56 2
65 1
62 1
78 1
61 1
69 1
80 1
68 1
59 1
Name: feat_78, dtype: int64
the value counts of feature feat_79 are:
0 54697
1 3654
2 1753
3 505
4 282
12 172
5 154
7 146
6 141
11 113
10 74
9 59
8 52
14 27
13 20
15 6
16 6
19 6
20 5
17 3
21 2
25 1
Name: feat_79, dtype: int64
the value counts of feature feat_80 are:
0 48565
1 6036
2 2278
3 1324
4 964
5 612
6 475
7 325
8 233
9 204
10 156
11 115
12 108
13 73
14 62
15 57
16 54
17 38
19 33
18 29
20 28
21 19
22 14
23 13
24 12
25 7
29 6
27 6
30 5
26 4
28 4
34 4
33 3
31 3
45 2
32 2
37 1
48 1
40 1
46 1
54 1
Name: feat_80, dtype: int64
the value counts of feature feat_81 are:
0 58695
1 2310
2 545
3 168
4 65
5 41
6 23
7 8
9 5
8 4
12 3
13 3
16 2
25 1
10 1
14 1
15 1
18 1
26 1
Name: feat_81, dtype: int64
the value counts of feature feat_82 are:
0 56442
1 2871
2 1181
3 575
4 324
5 198
6 98
7 55
8 45
9 22
10 20
11 15
13 6
15 6
14 5
12 4
16 4
19 2
22 1
17 1
20 1
21 1
24 1
Name: feat_82, dtype: int64
the value counts of feature feat_83 are:
0 53668
1 4047
2 1279
3 656
4 452
5 328
6 197
7 184
10 136
8 126
9 102
11 94
12 77
13 73
14 60
15 53
16 46
17 45
18 41
19 28
21 24
22 19
20 16
30 13
25 12
26 10
23 10
24 9
27 6
36 6
28 5
32 5
29 5
31 4
56 3
46 3
34 3
33 3
37 3
60 2
59 2
40 2
38 2
39 2
54 2
48 2
49 2
63 2
70 1
41 1
44 1
67 1
79 1
65 1
62 1
64 1
51 1
Name: feat_83, dtype: int64
the value counts of feature feat_84 are:
0 60455
1 918
2 247
3 89
4 44
6 21
7 14
5 12
22 6
26 5
38 5
8 5
27 4
29 4
30 4
36 4
35 3
37 3
10 3
31 3
13 2
11 2
28 2
16 2
40 2
17 2
23 2
21 1
33 1
56 1
24 1
55 1
53 1
51 1
20 1
15 1
50 1
39 1
48 1
41 1
42 1
76 1
Name: feat_84, dtype: int64
the value counts of feature feat_85 are:
0 48914
1 7130
2 2532
3 1114
4 673
5 373
6 272
7 154
8 104
9 95
10 88
12 45
13 42
11 41
17 40
14 38
16 34
15 32
19 23
18 22
20 19
24 12
22 12
21 9
23 9
28 7
29 7
26 7
27 4
25 4
30 3
35 3
32 2
31 2
36 2
41 2
48 2
45 2
43 1
34 1
52 1
55 1
Name: feat_85, dtype: int64
the value counts of feature feat_86 are:
0 36516
1 11885
2 5872
3 2694
4 1531
5 771
6 585
7 402
8 322
9 234
10 209
11 151
12 128
13 84
14 71
15 52
16 50
17 32
18 25
20 25
19 24
21 22
25 18
24 18
23 18
22 17
28 16
29 13
31 11
33 10
32 9
27 7
26 7
30 6
34 6
36 6
35 5
38 3
45 3
40 2
42 2
54 2
41 2
37 2
39 2
44 2
49 1
62 1
61 1
65 1
50 1
43 1
Name: feat_86, dtype: int64
the value counts of feature feat_87 are:
0 49859
1 7852
2 2165
3 729
4 436
5 239
6 147
7 84
8 77
9 46
10 34
11 23
15 22
13 19
14 19
16 16
12 14
18 9
17 8
20 7
22 7
19 7
21 6
24 6
33 4
43 4
32 3
42 3
23 3
25 3
26 3
36 3
35 3
30 2
41 2
67 1
29 1
60 1
34 1
28 1
37 1
27 1
38 1
49 1
40 1
54 1
47 1
45 1
31 1
Name: feat_87, dtype: int64
the value counts of feature feat_88 are:
0 41844
1 10010
2 3802
3 1868
4 1163
5 797
6 589
7 435
8 321
9 241
10 193
11 127
13 92
12 90
14 71
15 44
16 38
17 37
18 20
19 18
20 18
21 14
22 14
23 8
24 7
25 4
26 4
27 4
29 3
28 1
30 1
Name: feat_88, dtype: int64
the value counts of feature feat_89 are:
0 48248
1 8112
2 2750
3 1213
4 614
5 251
6 181
8 107
7 106
10 43
17 43
9 43
11 31
18 27
19 22
20 20
12 11
13 7
22 7
14 7
16 5
15 5
21 5
61 2
24 2
23 2
32 2
25 2
30 2
34 1
31 1
38 1
47 1
52 1
55 1
59 1
46 1
Name: feat_89, dtype: int64
the value counts of feature feat_90 are:
0 53542
1 3714
2 1338
3 601
4 444
6 271
5 266
7 162
8 145
10 108
9 102
11 75
13 70
16 60
15 59
14 57
12 57
17 50
19 43
22 39
18 37
20 35
23 33
25 33
21 29
24 28
26 24
36 24
30 21
29 19
47 17
40 17
37 17
27 16
34 16
31 16
33 15
39 15
46 15
35 14
43 13
52 13
38 12
28 12
45 12
44 10
53 10
41 9
32 9
48 8
54 8
42 8
49 8
61 6
58 6
50 5
55 5
62 5
57 5
60 5
59 5
51 4
75 4
78 4
71 4
73 4
63 3
67 3
56 3
80 3
72 3
66 2
77 2
90 2
85 2
79 2
68 1
100 1
92 1
64 1
89 1
65 1
109 1
98 1
130 1
106 1
74 1
70 1
99 1
69 1
127 1
Name: feat_90, dtype: int64
the value counts of feature feat_91 are:
0 57030
1 3022
2 726
3 312
4 179
5 90
6 51
39 41
8 36
7 31
14 28
11 19
21 19
19 19
10 18
12 18
9 18
18 18
22 17
17 17
16 15
20 14
13 12
15 11
23 11
26 11
24 8
27 8
28 8
32 7
35 6
37 6
31 6
29 6
34 5
38 5
33 5
36 5
25 4
41 4
30 3
40 1
52 1
47 1
51 1
48 1
46 1
45 1
42 1
44 1
Name: feat_91, dtype: int64
the value counts of feature feat_92 are:
0 48286
1 8603
2 2877
3 1025
4 461
5 240
6 124
7 92
9 60
8 53
10 16
11 12
12 10
13 7
14 4
15 3
18 2
17 2
19 1
Name: feat_92, dtype: int64
the value counts of feature feat_93 are:
0 58132
1 2606
2 639
3 215
4 90
5 34
6 27
7 15
8 14
9 13
10 10
20 7
12 7
15 6
18 6
11 5
37 4
41 4
13 4
17 4
16 3
19 3
36 3
14 3
39 2
30 2
27 2
26 2
38 2
54 1
87 1
40 1
24 1
25 1
21 1
83 1
50 1
35 1
28 1
34 1
60 1
62 1
63 1
Name: feat_93, dtype: int64
In [8]:
def value_counts_plots(dat, rows=4, cols=4):
    # draw a rows x cols grid of bar charts, one per column,
    # each showing the 20 most frequent values of that feature
    _, ax = plt.subplots(rows, cols, sharey='row', sharex='col', figsize=(cols*5, rows*5))
    for i, feat in enumerate(dat.columns[:(rows*cols)]):
        dat[feat].value_counts().iloc[:20].plot(kind='bar', ax=ax[i // cols, i % cols],
                                                title='value_counts {}'.format(feat))
value_counts_plots(tr_data.iloc[:, 1:17], 4, 4)
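The same helper can be pointed at any slice of the feature columns; for example (a usage sketch, not part of the original run), the next sixteen features:
# plot the top-20 value counts of feat_17 ... feat_32 on another 4x4 grid
value_counts_plots(tr_data.iloc[:, 17:33], 4, 4)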
In [9]:
cor_mat = tr_data.iloc[:, 1:-1].corr()  # pairwise correlations between the 93 feature columns
In [10]:
cor_mat
Out[10]:
(correlation matrix display: 93 rows × 93 columns of pairwise correlations among feat_1 … feat_93; for example, corr(feat_3, feat_4) = 0.583523 and corr(feat_3, feat_11) = 0.596243)
-0.014398
0.074762
0.072331
0.093156
0.110654
0.179266
0.202801
0.182162
0.084283
0.118820
0.104923
0.049617
0.165350
-0.010699
0.205417
0.080785
0.001806
0.073727
0.105498
feat_45
0.008601
0.018259
-0.007984
-2.809149e-03
-0.004808
0.025133
0.036262
0.076982
-0.013405
0.009960
-0.006148
0.004838
0.011546
0.015501
0.018245
0.041837
0.028681
0.038576
6.415927e-03
0.046258
-0.011902
0.034295
0.079246
0.150948
-0.014604
...
0.014862
0.066091
0.022953
0.008630
0.018758
0.022145
0.015586
0.077474
0.063837
0.025537
0.031569
0.000147
0.026733
-0.002928
0.045234
-0.002054
0.026840
-0.007039
0.051954
-0.018612
0.007262
0.017729
0.033762
-0.019553
0.014994
feat_46
-0.040647
0.042491
0.777517
4.362735e-01
-0.003583
0.001892
-0.012643
-0.050246
-0.053175
0.013546
0.579272
0.042124
-0.008579
-0.099296
-0.054633
0.179348
-0.033466
0.059749
-1.880024e-02
-0.032060
0.381266
0.171096
-0.011156
-0.034435
-0.142185
...
-0.036424
-0.040795
-0.049396
-0.042077
-0.010842
0.047543
-0.043645
-0.033789
-0.008934
-0.022669
-0.031722
0.403677
0.000241
0.363145
-0.027402
-0.001093
-0.049848
-0.092302
-0.019250
-0.077723
0.114977
-0.026419
-0.022626
0.049284
-0.004403
feat_47
0.026463
0.014574
-0.009560
-1.291450e-02
0.111428
0.018736
0.177082
0.015016
-0.027254
0.046108
-0.004880
0.018930
0.083185
-0.059736
-0.043214
0.038439
0.017601
0.072757
-1.206439e-02
0.103850
-0.019792
0.135443
0.059817
0.079616
-0.029371
...
-0.007611
0.054501
0.392560
-0.031692
0.013389
0.002744
0.004454
0.066191
-0.008686
0.193238
0.023517
0.010398
0.151613
0.007708
0.037423
0.105327
0.024970
-0.033694
0.030387
-0.042558
0.139668
-0.013074
-0.008806
0.016421
-0.000629
feat_48
0.130564
0.023310
-0.111987
-8.578789e-02
0.013228
0.013193
0.056163
0.104416
0.077385
-0.008297
-0.125103
0.035591
0.089556
0.252006
-0.051239
-0.030415
0.102710
0.019632
3.839369e-02
0.049880
-0.050541
0.002619
0.061294
0.192535
0.255795
...
0.041509
0.049905
0.030908
-0.036547
0.084806
0.016403
0.050548
0.125008
0.090005
0.074560
0.057616
-0.079256
0.006612
-0.073208
0.049315
0.006009
-0.040097
0.109048
0.099603
0.074914
0.009900
0.070611
0.047508
-0.023148
0.053749
feat_49
0.011179
0.311278
0.242692
3.000109e-01
0.054311
0.022864
0.124320
-0.019114
-0.042280
0.079449
0.250991
0.073520
0.075534
-0.070134
-0.040979
0.227799
0.049887
0.346609
1.960682e-02
0.194854
0.093235
0.307542
0.090079
0.045497
-0.088119
...
0.031680
0.139793
-0.007692
-0.009323
-0.001989
0.034674
-0.006816
0.047059
0.023213
0.023846
0.041352
0.265545
0.135250
0.343092
0.079155
0.036612
0.006069
-0.043704
0.059635
-0.045283
0.217931
0.075120
-0.006273
0.037485
0.029427
feat_50
0.040114
0.071664
-0.016765
5.343432e-03
0.024087
0.027150
0.140433
0.001231
0.009925
0.252350
-0.019520
0.041978
0.043774
-0.014922
-0.020619
0.058954
0.081685
0.074758
5.629922e-02
0.240263
-0.020847
0.122040
0.152904
0.098027
0.001294
...
0.072500
0.347168
0.104623
0.016511
0.009222
0.016972
0.015501
0.088080
0.005459
0.051061
0.073377
0.001354
0.173281
-0.001742
0.065188
-0.003742
0.184132
0.132234
0.125120
0.042334
0.076471
0.080146
0.008383
-0.040728
0.052934
feat_51
0.000902
0.085345
0.031979
4.164659e-02
0.060318
0.006902
0.036493
0.009513
-0.013181
0.014012
0.040465
0.016439
0.021930
-0.041262
-0.024504
0.024675
0.076780
0.114754
4.867697e-03
0.073081
0.020342
0.056819
0.085160
0.053253
-0.043517
...
0.012447
0.016843
0.027463
-0.019303
0.005856
0.022998
0.013396
0.026127
0.015799
0.028184
0.026622
0.093025
0.107963
0.118212
0.043710
0.079998
-0.014491
-0.025348
0.081376
-0.026219
0.063843
0.017031
0.000449
0.013779
0.017484
feat_52
-0.013082
0.105528
0.169111
2.095170e-01
0.003930
0.030929
0.006189
-0.028689
0.009792
0.007879
0.272738
0.068520
0.009553
-0.060335
-0.022071
0.093180
-0.004599
0.107743
1.712624e-02
0.033363
0.230007
0.081117
0.071561
0.064581
-0.073940
...
0.010622
0.001866
-0.018361
-0.024320
0.007225
0.554737
-0.012264
0.012416
-0.001000
-0.006928
0.006210
0.366544
0.019771
0.189014
0.005840
0.008737
-0.010815
-0.040474
-0.003709
-0.041361
0.115156
0.043089
-0.012082
0.026573
0.014812
feat_53
0.156400
-0.008015
-0.024222
-1.630815e-02
0.040202
0.005914
0.048544
0.018674
-0.030075
0.082810
-0.052811
0.018344
0.161234
0.006808
-0.006163
0.086820
0.018729
0.042412
-1.375631e-02
0.028131
0.002909
0.089161
-0.003534
0.021068
0.149136
...
-0.012123
0.104217
0.013776
0.017736
-0.008826
0.000711
-0.011755
0.011055
0.111723
0.051046
0.349737
-0.020713
0.029944
-0.018108
-0.006772
0.005698
0.010533
0.076241
0.004038
-0.015673
0.081352
-0.016199
-0.004092
0.135595
-0.006926
feat_54
0.001399
0.179173
0.694048
5.254557e-01
0.055299
0.031379
0.079607
-0.025421
-0.078499
0.074637
0.494180
0.102486
0.119941
-0.112305
0.034175
0.468766
0.033721
0.272898
2.603274e-04
0.096978
0.318332
0.502168
0.036249
0.016601
-0.136167
...
-0.007706
0.127314
-0.019349
0.064084
-0.016801
0.066009
-0.050655
0.008459
0.058522
0.019843
0.025234
0.444238
0.082381
0.394613
0.026663
0.026109
0.004352
-0.081509
0.057848
-0.083348
0.256305
0.034802
0.018972
0.153724
0.029959
feat_55
0.165044
0.134617
0.055445
7.007713e-02
0.142221
0.062765
0.343752
0.063415
-0.042138
0.213003
0.022723
0.081504
0.223364
-0.055732
-0.040297
0.131218
0.163377
0.183218
1.804592e-02
0.254142
-0.003326
0.339893
0.087014
0.149115
-0.010211
...
0.062175
0.195079
0.185377
-0.004484
0.001851
0.016692
0.039229
0.136822
0.068623
0.246042
0.106119
0.051735
0.182156
0.053607
0.113962
0.102831
0.025289
0.021798
0.127128
-0.033113
0.157031
0.069305
0.055702
0.049364
0.055275
feat_56
0.015738
0.018908
0.016288
1.961792e-02
0.001560
0.004014
-0.003145
0.002285
-0.022639
0.009786
0.033221
0.007665
0.002066
0.004436
-0.022519
0.021978
-0.012085
0.014345
8.823711e-03
0.012953
0.026285
0.003457
0.013547
-0.016847
0.014930
...
-0.002325
0.089528
-0.020002
-0.019436
-0.000268
0.032422
-0.020258
-0.004393
0.005466
-0.004568
0.013829
0.043107
-0.002544
0.023436
-0.005914
-0.003549
0.016538
0.072723
-0.005218
0.012290
0.053164
0.008288
-0.003705
0.028787
0.007368
feat_57
-0.014598
0.104263
0.018887
6.946199e-02
0.037162
0.036655
0.073969
0.011968
-0.049335
0.017525
-0.007302
0.042518
0.075397
-0.101382
-0.069910
0.049373
0.173169
0.284435
2.619428e-02
0.136076
-0.022876
0.185503
0.105362
0.208961
-0.110428
...
0.073636
0.028737
0.034841
-0.055874
0.006254
0.010981
0.056871
0.066284
0.006597
0.052778
0.040919
0.049279
0.084484
0.036242
0.144449
0.043523
-0.034281
-0.055232
0.102572
-0.068615
0.051941
0.087563
0.010013
0.100979
0.045129
feat_58
0.028696
0.371146
0.034506
3.910286e-02
-0.001546
0.110743
0.106203
0.025674
-0.028290
0.020933
0.179361
0.082574
0.132569
-0.059363
-0.044920
0.018927
0.040370
0.048980
2.345026e-01
0.310481
0.004761
0.057172
0.110441
0.370922
-0.091991
...
0.392839
0.038319
-0.020289
-0.024774
0.057573
0.058768
0.053835
0.365075
-0.001763
0.044968
0.023352
0.078906
0.001446
0.031475
0.235803
-0.005284
-0.015734
-0.026106
0.005737
-0.045055
0.022821
0.407341
0.056724
-0.014156
0.034472
feat_59
0.139364
-0.013283
-0.021717
-1.580168e-02
0.038893
0.001285
0.009624
0.118223
-0.021451
0.051189
-0.039646
0.026684
0.174173
-0.026693
-0.017773
0.046633
0.058797
0.099073
-8.377876e-03
0.020746
-0.024209
0.043127
0.004395
0.041787
0.089081
...
-0.003780
0.229503
0.015826
0.014535
0.015477
-0.002264
-0.000676
0.013573
0.048736
-0.007320
0.189173
-0.024168
0.055354
-0.017358
-0.011922
-0.002065
0.037042
0.148417
0.149499
-0.002890
0.013638
-0.011079
-0.003928
0.010828
0.047987
feat_60
-0.020267
0.018219
0.011369
6.890047e-02
0.008796
0.022289
0.012336
-0.020328
-0.066333
-0.000619
-0.025990
0.027027
0.046147
-0.118618
-0.079152
0.043111
0.090963
0.234885
-2.039351e-03
0.036649
-0.032751
0.166049
0.045739
0.108238
-0.131640
...
-0.014202
0.008525
0.020767
-0.062609
-0.019131
-0.006299
-0.006871
-0.011742
0.024297
-0.002122
0.029188
0.026903
0.055102
0.007344
0.068019
0.020174
-0.043094
-0.078149
0.099639
-0.091812
0.031379
-0.014203
0.011737
0.128371
0.055350
feat_61
-0.038124
0.063873
0.494881
3.544023e-01
0.001559
-0.002227
-0.006284
-0.045203
-0.044343
0.004397
0.543858
0.050080
-0.009951
-0.093809
-0.056169
0.150952
-0.030037
0.084745
-1.463294e-02
-0.027135
0.298307
0.136120
0.005186
-0.018301
-0.132584
...
-0.030713
-0.039559
-0.046490
-0.042920
-0.006029
0.104516
-0.036555
-0.020551
-0.001620
-0.020607
-0.019744
0.575101
0.007611
0.474954
-0.021269
0.002943
-0.047659
-0.083301
-0.027363
-0.070815
0.160405
-0.018112
-0.017045
0.042357
-0.001258
feat_62
0.091636
0.029040
0.010733
2.522773e-02
0.073215
0.022874
0.025485
0.214717
-0.021138
0.054733
-0.051554
0.126326
0.164920
-0.026822
0.097920
0.337177
0.105504
0.197339
-1.471355e-02
0.082477
0.037588
0.319551
0.005936
0.009847
0.077595
...
-0.011587
0.159988
0.089270
0.162735
-0.020436
0.001716
-0.044457
-0.001444
0.214828
0.001336
0.169192
0.027840
0.090282
0.026090
0.022311
0.012320
0.031092
-0.006503
0.081320
-0.048727
0.246666
0.000511
0.279845
0.327176
0.012774
feat_63
0.069799
0.037020
0.031241
5.990274e-02
0.082383
0.017050
0.039585
0.148673
0.013012
0.090421
0.013436
0.038777
0.070109
-0.020837
0.018947
0.104440
0.032391
0.119569
1.220845e-02
0.047287
0.015895
0.105657
0.018242
0.035863
0.023574
...
0.014252
0.141850
0.049447
0.030482
-0.000880
0.013859
-0.004057
0.026610
0.035163
0.006096
0.194618
0.066939
0.088525
0.052736
0.005221
0.013394
0.026954
0.080492
0.056860
0.026337
0.163009
0.018277
0.009487
0.031464
0.023298
feat_64
-0.010499
-0.005354
-0.065105
-4.728813e-02
-0.021017
-0.002764
0.011165
0.003194
0.702951
0.022536
-0.082868
-0.012603
-0.003286
-0.066800
0.003848
-0.043076
0.021551
-0.001922
-1.698588e-02
0.058862
-0.035863
-0.017547
0.029366
0.099968
-0.009117
...
-0.004913
0.029886
0.058239
-0.039423
0.018469
-0.007032
-0.011456
-0.000467
-0.019363
0.029796
-0.020209
-0.053117
0.010230
-0.047284
-0.001432
-0.013347
0.009820
0.064883
0.075964
0.117596
-0.015218
-0.001930
-0.020846
-0.071886
0.023064
feat_65
0.110041
0.078801
0.065492
6.228472e-02
0.228349
0.066867
0.202346
0.025544
-0.038163
0.182756
0.041014
0.121550
0.108567
-0.011517
0.034954
0.182735
0.072733
0.114910
1.579718e-02
0.143265
0.044702
0.180556
0.066825
0.074939
0.027847
...
0.015448
0.237573
0.045051
0.067476
0.006393
0.050069
0.014663
0.072170
0.115692
0.099054
0.066714
0.062834
0.127931
0.088551
0.057058
0.120772
0.066922
0.030362
0.050138
-0.011600
0.125884
0.029076
0.001188
0.044286
0.015500
feat_66
0.053010
0.175620
0.088017
1.296545e-01
0.048364
0.033285
0.122660
0.115175
-0.001778
0.100722
0.050764
0.084591
0.122272
-0.081709
-0.018487
0.289082
0.165143
0.285667
3.078749e-02
0.215576
0.022228
0.382385
0.114657
0.116615
-0.035200
...
0.079321
0.212931
0.072601
0.037809
0.006615
0.021841
0.005674
0.081866
0.147127
0.058784
0.085545
0.107806
0.110691
0.093782
0.088982
0.033794
0.031291
0.088686
0.206406
-0.000679
0.205289
0.094925
0.098063
0.123694
0.067957
feat_67
0.154301
0.068667
-0.110081
-8.045694e-02
0.061964
0.038289
0.148598
0.320949
0.176921
0.043117
-0.114944
0.079377
0.211335
-0.086404
-0.052486
-0.027728
0.306513
0.108521
3.587552e-02
0.199391
-0.075966
0.061315
0.115559
0.435017
-0.001504
...
0.117455
0.127927
0.131528
-0.035209
0.098574
0.027789
0.121735
0.323678
0.052970
0.188128
0.152964
-0.076209
0.075488
-0.070049
0.127444
0.039005
-0.020048
0.081445
0.295803
-0.058706
0.005220
0.089262
0.112052
-0.011247
0.129018
feat_68
0.014674
-0.012802
-0.030992
-2.009191e-02
0.107405
0.021619
0.040309
0.075384
-0.012192
0.001693
-0.040588
0.014873
0.026881
-0.076979
-0.049076
0.015351
0.038816
0.078124
-5.128506e-03
0.011770
-0.016256
0.036537
0.008228
0.096420
-0.032516
...
-0.017943
0.008038
0.196309
-0.035915
-0.004141
0.008749
-0.008292
0.013405
0.001655
0.084630
0.062394
0.004003
0.084468
0.000787
0.000817
0.036198
-0.024620
-0.038904
0.001672
-0.059843
0.125150
-0.023839
0.022515
0.095970
-0.004602
feat_69
0.007544
0.307406
-0.032748
-1.446082e-02
-0.003294
0.074836
0.131430
0.046258
-0.029335
0.077354
0.010697
0.057074
0.086611
-0.058352
-0.052924
-0.021699
0.100656
0.071926
3.976207e-01
0.348327
-0.044378
0.045054
0.143339
0.419143
-0.091005
...
1.000000
0.077678
-0.000955
-0.032479
0.045667
0.034456
0.065643
0.409691
0.002209
0.044834
0.036435
-0.014590
0.019568
-0.017440
0.204569
-0.007221
-0.006954
-0.025538
0.027690
-0.022918
0.011806
0.549489
0.041206
-0.037961
0.032052
feat_70
0.165442
0.112968
-0.018774
2.079779e-02
0.118510
0.052401
0.237907
0.023089
-0.056205
0.322857
-0.050540
0.071630
0.145294
0.054226
0.118795
0.255740
0.108789
0.169238
3.117130e-02
0.316436
-0.007589
0.231848
0.137402
0.098407
0.126664
...
0.077678
1.000000
0.088064
0.192594
-0.002807
0.010436
-0.016054
0.122972
0.067065
0.105585
0.144247
0.004856
0.192565
0.009776
0.069588
0.045636
0.361941
0.225792
0.212133
0.140850
0.163631
0.074178
0.030560
0.007310
0.093488
feat_71
0.013712
-0.002336
-0.053020
-4.241268e-02
0.056428
0.011901
0.115813
0.081664
0.043286
0.104834
-0.071717
0.028629
0.052337
-0.076935
-0.033807
0.001594
0.150286
0.074981
-2.087111e-02
0.098633
-0.040361
0.072454
0.056965
0.123829
-0.012005
...
-0.000955
0.088064
1.000000
-0.015731
-0.006121
-0.012457
0.025972
0.036850
-0.002787
0.099693
0.063309
-0.029219
0.132951
-0.025602
0.026032
0.011858
0.013894
-0.015410
0.060004
-0.048676
0.076348
-0.019694
0.050622
0.000368
0.002001
feat_72
-0.029983
-0.023267
-0.045339
-2.979578e-02
0.005177
-0.011090
-0.014921
-0.029868
-0.058147
0.004225
-0.086496
0.031888
-0.008543
-0.022358
0.764664
0.394998
-0.023103
-0.002190
-2.258364e-02
0.012246
0.072693
0.113100
-0.012941
-0.050412
0.015754
...
-0.032479
0.192594
-0.015731
1.000000
-0.020547
-0.024074
-0.043639
-0.040593
0.016045
-0.022904
-0.017522
-0.038547
0.002439
-0.028420
-0.034569
-0.011992
0.294384
0.008897
0.013536
0.004066
0.057040
-0.030673
-0.008936
0.005300
-0.008233
feat_73
0.140815
0.039192
-0.013972
-1.128547e-02
0.001609
0.025023
0.022819
0.028999
0.022679
0.000240
-0.006893
0.181833
0.030927
-0.040175
-0.019194
-0.026285
0.002826
-0.008668
5.683692e-02
0.029960
-0.009870
-0.012546
0.098657
0.048055
-0.041218
...
0.045667
-0.002807
-0.006121
-0.020547
1.000000
0.035633
0.190639
0.220832
-0.001634
0.018161
0.015294
-0.007285
0.002868
-0.005143
0.027131
0.012885
-0.010675
-0.000841
-0.004759
-0.026363
-0.006704
0.070001
0.007193
-0.024017
-0.000163
feat_74
0.051365
0.070724
0.041559
4.909735e-02
0.017265
0.043160
0.053059
-0.000431
0.007594
0.008912
0.186561
0.087012
0.021072
-0.033498
-0.013632
0.034125
0.001770
0.038278
2.261396e-02
0.033726
0.123215
0.040243
0.085407
0.113057
-0.040263
...
0.034456
0.010436
-0.012457
-0.024074
0.035633
1.000000
-0.002297
0.058736
0.009406
0.069628
0.016904
0.174775
0.008959
0.055336
0.024667
0.028021
-0.000453
-0.015945
0.003992
-0.025207
0.042104
0.055372
0.016941
0.004497
0.021967
feat_75
0.011596
0.093689
-0.044724
-3.145389e-02
0.015279
0.006951
0.039865
0.031466
-0.027313
0.003828
-0.034513
0.052536
0.022496
-0.079099
-0.048894
-0.059511
0.025506
-0.011790
3.635667e-02
0.053898
-0.047657
-0.020839
0.026762
0.156134
-0.059513
...
0.065643
-0.016054
0.025972
-0.043639
0.190639
-0.002297
1.000000
0.172375
-0.010904
0.013876
0.006885
-0.037439
0.011281
-0.022992
0.063463
0.028478
-0.026329
-0.031401
0.001201
-0.058630
-0.014925
0.160418
-0.002625
-0.037710
0.006208
feat_76
0.153808
0.259360
-0.028670
-1.379188e-02
0.035570
0.073867
0.375114
0.081682
-0.027424
0.106752
0.021357
0.113921
0.189751
-0.058841
-0.061486
-0.015473
0.134841
0.059279
1.795475e-01
0.359596
-0.041637
0.060522
0.208441
0.490795
-0.080761
...
0.409691
0.122972
0.036850
-0.040593
0.220832
0.058736
0.172375
1.000000
0.003596
0.329988
0.073039
0.004267
0.056915
-0.002648
0.291364
0.031277
0.000682
0.010324
0.063411
-0.050417
0.023242
0.291884
0.175163
-0.050887
0.029426
feat_77
0.123752
0.014911
-0.001584
1.531773e-02
0.030462
0.006501
0.005769
0.027486
-0.020185
0.019069
-0.008064
0.021418
0.178263
0.007714
-0.016673
0.129884
0.099665
0.102684
-7.950547e-03
0.017077
0.000132
0.132590
0.010125
0.033078
0.093776
...
0.002209
0.067065
-0.002787
0.016045
-0.001634
0.009406
-0.010904
0.003596
1.000000
0.005717
0.068083
0.009147
0.022957
0.003631
0.004793
0.001243
0.005602
0.020294
0.019275
-0.007396
0.021591
-0.004988
0.026376
0.076551
0.001715
feat_78
0.279202
0.094256
-0.021979
-1.449856e-02
0.070709
0.061250
0.567084
0.079623
-0.015922
0.091760
-0.009866
0.026525
0.224472
-0.041530
-0.027200
-0.003965
0.069539
0.042958
6.203551e-03
0.196400
-0.027742
0.084217
0.091219
0.223999
-0.038806
...
0.044834
0.105585
0.099693
-0.022904
0.018161
0.069628
0.013876
0.329988
0.005717
1.000000
0.036415
0.005224
0.047949
-0.012932
0.166186
0.035861
0.004071
-0.018797
0.063539
-0.030010
0.014639
0.043339
0.068450
-0.028596
0.016047
feat_79
0.228912
0.033668
-0.020566
-1.083473e-02
0.055115
0.009942
0.066753
0.083714
-0.036116
0.113659
-0.024491
0.036666
0.233604
-0.045473
-0.039033
0.045703
0.114670
0.080390
1.249856e-02
0.074068
-0.016071
0.093992
0.039133
0.083032
0.092507
...
0.036435
0.144247
0.063309
-0.017522
0.015294
0.016904
0.006885
0.073039
0.068083
0.036415
1.000000
0.005482
0.080384
-0.003830
0.028770
0.002540
0.004663
0.095254
0.099579
-0.018615
0.073207
0.031099
0.021616
0.162033
0.029082
feat_80
-0.013303
0.155768
0.442036
4.057725e-01
0.026223
0.017648
0.028860
-0.038382
-0.046721
0.019042
0.561843
0.064162
0.010431
-0.104507
-0.057948
0.205393
-0.015193
0.182025
-2.391456e-03
0.025698
0.279618
0.189565
0.037434
0.024981
-0.138356
...
-0.014590
0.004856
-0.029219
-0.038547
-0.007285
0.174775
-0.037439
0.004267
0.009147
0.005224
0.005482
1.000000
0.054154
0.522616
0.018600
0.029430
-0.035876
-0.081888
-0.004588
-0.076250
0.350787
0.012623
-0.017815
0.063401
0.012651
feat_81
0.032427
0.052101
0.013089
2.828377e-02
0.129333
0.044136
0.144308
0.035102
-0.005847
0.135928
0.004795
0.052954
0.065423
-0.037389
-0.019557
0.076330
0.123831
0.135914
8.124693e-04
0.144703
-0.004052
0.155515
0.116575
0.084692
-0.021283
...
0.019568
0.192565
0.132951
0.002439
0.002868
0.008959
0.011281
0.056915
0.022957
0.047949
0.080384
0.054154
1.000000
0.067345
0.049550
0.096658
0.054972
0.013808
0.084096
-0.017469
0.166234
0.009379
0.017243
0.018565
0.019378
feat_82
-0.026085
0.119109
0.438458
4.365413e-01
0.057400
0.014907
0.022059
-0.034409
-0.039806
0.029741
0.420361
0.054571
0.010074
-0.084089
-0.050702
0.168008
-0.012734
0.156176
-6.879326e-03
0.023469
0.221444
0.177255
0.028166
-0.016468
-0.114821
...
-0.017440
0.009776
-0.025602
-0.028420
-0.005143
0.055336
-0.022992
-0.002648
0.003631
-0.012932
-0.003830
0.522616
0.067345
1.000000
-0.003285
0.054646
-0.034368
-0.065189
-0.012153
-0.059553
0.266249
-0.001795
-0.014641
0.049661
0.005497
feat_83
0.059165
0.371691
-0.019914
-1.051874e-03
0.008006
0.035145
0.282069
0.033479
-0.032875
0.052025
-0.000190
0.036529
0.115747
-0.070643
-0.052090
0.017558
0.113987
0.085116
5.672093e-02
0.364803
-0.031940
0.083639
0.181739
0.300629
-0.092796
...
0.204569
0.069588
0.026032
-0.034569
0.027131
0.024667
0.063463
0.291364
0.004793
0.166186
0.028770
0.018600
0.049550
-0.003285
1.000000
-0.001047
-0.009157
-0.029711
0.072006
-0.052930
0.035181
0.243942
0.095801
-0.018325
0.054188
feat_84
0.049634
0.009845
0.011159
5.684499e-03
0.467329
0.177777
0.062634
0.005064
-0.013569
0.017939
0.017724
0.009807
0.023221
-0.027058
-0.009311
0.035211
0.000334
0.028752
-2.846741e-03
0.001723
-0.004702
0.041519
0.011832
0.091092
-0.018320
...
-0.007221
0.045636
0.011858
-0.011992
0.012885
0.028021
0.028478
0.031277
0.001243
0.035861
0.002540
0.029430
0.096658
0.054646
-0.001047
1.000000
-0.010210
-0.003459
0.013631
-0.017903
0.103643
-0.006013
-0.003444
0.048431
0.003723
feat_85
-0.008739
-0.006764
-0.048626
-3.315343e-02
0.034062
0.004290
0.037874
-0.003416
-0.031462
0.086758
-0.074293
0.019283
0.002594
-0.021455
0.246847
0.110850
0.015559
-0.001555
-8.292391e-03
0.084570
-0.006180
0.044396
0.056994
-0.018990
0.021119
...
-0.006954
0.361941
0.013894
0.294384
-0.010675
-0.000453
-0.026329
0.000682
0.005602
0.004071
0.004663
-0.035876
0.054972
-0.034368
-0.009157
-0.010210
1.000000
0.109643
0.049250
0.027886
0.053582
-0.003931
-0.023091
-0.043484
0.023390
feat_86
0.107947
-0.039090
-0.096093
-7.102916e-02
0.013879
0.010455
-0.009169
-0.029395
-0.019144
0.159447
-0.123339
-0.007214
0.004850
0.145787
-0.002529
0.003610
0.049102
-0.029295
-1.455986e-02
0.016850
-0.045562
-0.018347
0.121170
0.015444
0.263924
...
-0.025538
0.225792
-0.015410
0.008897
-0.000841
-0.015945
-0.031401
0.010324
0.020294
-0.018797
0.095254
-0.081888
0.013808
-0.065189
-0.029711
-0.003459
0.109643
1.000000
0.073685
0.426972
-0.011822
-0.019803
-0.024005
-0.049393
0.029035
feat_87
0.089374
0.047451
-0.009838
5.054728e-03
0.013999
0.015256
0.089574
0.059929
-0.016925
0.077421
-0.032969
0.016089
0.093870
-0.020229
-0.023191
0.077770
0.214221
0.126886
4.116324e-04
0.220475
-0.016862
0.219974
0.111837
0.123298
-0.011294
...
0.027690
0.212133
0.060004
0.013536
-0.004759
0.003992
0.001201
0.063411
0.019275
0.063539
0.099579
-0.004588
0.084096
-0.012153
0.072006
0.013631
0.049250
0.073685
1.000000
0.023053
0.066008
0.014696
0.028850
0.001424
0.499990
feat_88
0.020830
-0.047035
-0.082336
-6.748367e-02
-0.019201
-0.015437
-0.033646
-0.050931
0.001160
0.054635
-0.114491
-0.024324
-0.036259
0.323089
0.010840
-0.007257
-0.034139
-0.035981
-1.848491e-02
0.004081
-0.030401
-0.045439
-0.014039
-0.043479
0.207974
...
-0.022918
0.140850
-0.048676
0.004066
-0.026363
-0.025207
-0.058630
-0.050417
-0.007396
-0.030010
-0.018615
-0.076250
-0.017469
-0.059553
-0.052930
-0.017903
0.027886
0.426972
0.023053
1.000000
-0.022552
-0.031679
-0.033653
-0.070120
-0.008631
feat_89
0.096851
0.105527
0.174781
1.837145e-01
0.119951
0.035042
0.063511
0.007974
-0.019147
0.061498
0.137374
0.082220
0.062990
-0.038881
0.029547
0.248364
0.035390
0.247462
1.111595e-02
0.111231
0.105392
0.244779
0.059743
0.023581
-0.012866
...
0.011806
0.163631
0.076348
0.057040
-0.006704
0.042104
-0.014925
0.023242
0.021591
0.014639
0.073207
0.350787
0.166234
0.266249
0.035181
0.103643
0.053582
-0.011822
0.066008
-0.022552
1.000000
0.027764
0.015917
0.129622
0.030650
feat_90
0.010310
0.515022
-0.015068
9.454061e-03
0.004842
0.054034
0.129578
0.026807
-0.020698
0.049908
0.045074
0.062721
0.107722
-0.060240
-0.046616
0.016863
0.045218
0.094336
4.509254e-01
0.370282
-0.033193
0.098595
0.141869
0.357270
-0.088187
...
0.549489
0.074178
-0.019694
-0.030673
0.070001
0.055372
0.160418
0.291884
-0.004988
0.043339
0.031099
0.012623
0.009379
-0.001795
0.243942
-0.006013
-0.003931
-0.019803
0.014696
-0.031679
0.027764
1.000000
0.014812
-0.035311
0.039864
feat_91
0.037264
0.026383
-0.012417
-1.031241e-02
0.012012
0.012465
0.068506
0.095990
-0.014742
0.024025
-0.029511
0.063965
0.044338
-0.038444
-0.034402
0.048494
0.088508
0.037275
4.085393e-03
0.079181
-0.019779
0.104921
0.010438
0.090833
-0.045759
...
0.041206
0.030560
0.050622
-0.008936
0.007193
0.016941
-0.002625
0.175163
0.026376
0.068450
0.021616
-0.017815
0.017243
-0.014641
0.095801
-0.003444
-0.023091
-0.024005
0.028850
-0.033653
0.015917
0.014812
1.000000
0.104226
-0.000045
feat_92
0.054777
-0.008219
0.066921
8.763105e-02
0.065331
0.015479
-0.032261
0.013608
-0.069707
-0.006869
0.013179
0.063922
0.071953
-0.040133
-0.018206
0.210499
-0.006538
0.126640
-2.766153e-02
-0.018715
0.058008
0.200593
-0.031837
-0.024375
0.030135
...
-0.037961
0.007310
0.000368
0.005300
-0.024017
0.004497
-0.037710
-0.050887
0.076551
-0.028596
0.162033
0.063401
0.018565
0.049661
-0.018325
0.048431
-0.043484
-0.049393
0.001424
-0.070120
0.129622
-0.035311
0.104226
1.000000
-0.003653
feat_93
0.081783
0.054593
0.006814
1.574563e-02
0.002038
0.008521
0.034912
0.005131
-0.006038
0.041316
0.003326
0.012722
0.038989
-0.018127
-0.020369
0.031467
0.056695
0.058100
1.424267e-02
0.110054
-0.007677
0.113276
0.084945
0.089200
-0.015708
...
0.032052
0.093488
0.002001
-0.008233
-0.000163
0.021967
0.006208
0.029426
0.001715
0.016047
0.029082
0.012651
0.019378
0.005497
0.054188
0.003723
0.023390
0.029035
0.499990
-0.008631
0.030650
0.039864
-0.000045
-0.003653
1.000000
93 rows × 93 columns
In [11]:
sns.heatmap(cor_mat)
Out[11]:
<matplotlib.axes._subplots.AxesSubplot at 0x7f566fdf8090>
In [12]:
#to get a better look we need to increase the plot area
f, ax = plt.subplots(figsize=(20,17))
sns.heatmap(cor_mat,vmax=0.8,square=True)
Out[12]:
<matplotlib.axes._subplots.AxesSubplot at 0x7f566ff18710>
we would also like to explore the correlation with the target variable, but it is stored as a string (e.g. 'Class_2'), so let's convert it into a numerical feature
In [13]:
tr_data['parsed_target'] = [int(n.split('_')[1]) for n in tr_data.target] #extract the class number from 'Class_N'
tr_data.drop('target',axis=1,inplace=True) #drop the original string column
f, ax = plt.subplots(figsize=(20,17))
cor_mat = tr_data.iloc[:,1:].corr()
sns.heatmap(cor_mat,vmax=0.8,square=True)
Out[13]:
<matplotlib.axes._subplots.AxesSubplot at 0x7f566fe46bd0>
In [14]:
f, ax = plt.subplots(figsize = (20,5))
cor_mat.iloc[:-1,-1].plot(kind = 'bar')
Out[14]:
<matplotlib.axes._subplots.AxesSubplot at 0x7f5662c89110>
we can notice some features with a weak positive correlation with the target and others with a moderate negative correlation
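to pin down which features those are, we can sort that last column of cor_mat directly - a minimal sketch:
In [ ]:
#rank the features by their correlation with parsed_target (the last column of cor_mat)
target_cor = cor_mat.iloc[:-1,-1].sort_values()
print 'strongest negative correlations:\r\n{}'.format(target_cor.head())
print 'strongest positive correlations:\r\n{}'.format(target_cor.tail())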
In [15]:
def target_bar_plots(dat,cols = 4, rows = 4):
    #plot a grid of bar charts: the count of non-zero values per category for each feature
    _,ax = plt.subplots(rows,cols,sharey='row',sharex='col',figsize = (cols*5,rows*5))
    for i,feat in enumerate(dat.columns[:(rows*cols)]):
        try:
            dat.pivot_table(index=['parsed_target'],values=feat,aggfunc=np.count_nonzero).plot(
                kind = 'bar',color=my_color_map ,ax=ax[i/cols, i%cols],title =
                'non_zero values by category for {}'.format(feat))
        except:
            #skip columns (e.g. the id) that cannot be aggregated this way
            pass
target_bar_plots(tr_data,4,24)
while examining these plots we can already make some assumptions
as to which categories will be easier to predict and which will be the harder ones - can you guess?
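one simple hint is class frequency - the larger a class, the easier it usually is to learn; here is a quick check of the class sizes (a sketch, using the parsed_target column we created above):
In [ ]:
#number of training samples per category
print tr_data.parsed_target.value_counts().sort_index()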
now let's look at the test set features and check whether they resemble the train features
In [16]:
#tag each row with its origin so we can compare train/test distributions after concatenating
tr_data['source'] = 'train'
te_data['source'] = 'test'
all_data = pd.concat([tr_data,te_data],axis=0)
tr_data.drop('source',axis=1,inplace=True)
te_data.drop('source',axis=1,inplace=True)
In [17]:
[x for x in all_data.columns[1:7]]
Out[17]:
['feat_10', 'feat_11', 'feat_12', 'feat_13', 'feat_14', 'feat_15']
note that the concat re-ordered the columns alphabetically (the train and test frames have different column sets), which is why positions 1:7 hold feat_10 through feat_15 rather than feat_1 through feat_6
In [18]:
#melt into long format so each violin can be split by train/test source
molten = pd.melt(all_data, id_vars = 'source',value_vars = ['feat_'+str(x) for x in range(13,15)])
plt.subplots(figsize = (20,8))
sns.violinplot(data=molten, x= 'variable',y='value',hue='source',split = True,palette=my_color_map)
Out[18]:
<matplotlib.axes._subplots.AxesSubplot at 0x7f5653919b10>
In [19]:
from sklearn.model_selection import train_test_split
X_train, X_val, y_train, y_val = train_test_split(tr_data.iloc[:,1:-1],tr_data.parsed_target,test_size = 0.2,random_state =12345)
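since the classes are imbalanced, it can also be worth stratifying the split so each class keeps its proportion in both subsets; a sketch using the stratify parameter of train_test_split (optional - the rest of the notebook uses the unstratified split above):
In [ ]:
#stratified variant: preserves the per-class proportions in train and validation
X_train_s, X_val_s, y_train_s, y_val_s = train_test_split(
    tr_data.iloc[:,1:-1], tr_data.parsed_target,
    test_size=0.2, random_state=12345, stratify=tr_data.parsed_target)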
In [20]:
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix
knn = KNeighborsClassifier(n_jobs=4,n_neighbors=4)
knn.fit(X_train,y_train)
knn4_pred = knn.predict(X_val)
print confusion_matrix(y_pred=knn4_pred,y_true=y_val)
sns.heatmap(xticklabels=range(1,10),yticklabels=range(1,10),data = confusion_matrix(y_pred=knn4_pred,y_true=y_val),cmap='Greens')
[[ 247 20 3 2 3 20 9 34 72]
[ 5 2723 413 47 6 4 15 5 4]
[ 1 817 663 41 1 1 21 4 1]
[ 3 225 126 169 8 11 13 1 1]
[ 1 11 3 0 520 0 0 0 0]
[ 34 29 7 6 2 2652 32 28 20]
[ 31 80 55 10 4 37 300 18 3]
[ 51 21 8 0 2 60 24 1496 33]
[ 71 27 6 2 4 32 2 36 879]]
Out[20]:
<matplotlib.axes._subplots.AxesSubplot at 0x7f564eed04d0>
as we can see, our assumption was indeed correct - categories 6, 8 and 2 are among the easiest to predict
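because the class sizes differ a lot, the raw counts in the confusion matrix can be misleading; normalizing each row to sum to 1 shows the per-class recall directly - a minimal sketch:
In [ ]:
#row-normalize the confusion matrix: each row then shows the recall for that true class
cm = confusion_matrix(y_pred=knn4_pred,y_true=y_val).astype(float)
sns.heatmap(cm / cm.sum(axis=1)[:,np.newaxis],cmap='Greens',
            xticklabels=range(1,10),yticklabels=range(1,10))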
In [21]:
from sklearn.metrics import classification_report
print 'classification report results:\r\n' + classification_report(y_pred=knn4_pred,y_true=y_val)
classification report results:
precision recall f1-score support
1 0.56 0.60 0.58 410
2 0.69 0.85 0.76 3222
3 0.52 0.43 0.47 1550
4 0.61 0.30 0.41 557
5 0.95 0.97 0.96 535
6 0.94 0.94 0.94 2810
7 0.72 0.56 0.63 538
8 0.92 0.88 0.90 1695
9 0.87 0.83 0.85 1059
avg / total 0.78 0.78 0.77 12376
In [22]:
#this will give higher importance to successfully classifying the 4th class items
class_weights = {1:1,2:1,3:1,4:10,5:1,6:1,7:1,8:1,9:1}
from sklearn.tree import DecisionTreeClassifier
dtc = DecisionTreeClassifier(class_weight=class_weights,max_depth=100,max_features=92,min_samples_split=2,random_state=12345)
dtc.fit(X_train,y_train)
tree_pred = dtc.predict(X_val)
print confusion_matrix(y_pred=tree_pred,y_true=y_val)
sns.heatmap(confusion_matrix(y_pred=tree_pred,y_true=y_val),cmap='Greens',xticklabels=range(1,10),yticklabels=range(1,10))
print 'classification report results:\r\n' + classification_report(y_pred=tree_pred,y_true=y_val)
[[ 143 21 10 2 3 44 21 71 95]
[ 19 2197 679 175 13 35 40 30 34]
[ 2 630 738 79 4 16 53 17 11]
[ 4 144 67 272 5 32 21 9 3]
[ 0 13 5 3 508 1 2 1 2]
[ 33 38 26 29 1 2512 55 69 47]
[ 21 64 52 18 3 46 260 44 30]
[ 57 30 24 6 8 101 42 1365 62]
[ 80 62 15 8 8 54 25 57 750]]
classification report results:
precision recall f1-score support
1 0.40 0.35 0.37 410
2 0.69 0.68 0.68 3222
3 0.46 0.48 0.47 1550
4 0.46 0.49 0.47 557
5 0.92 0.95 0.93 535
6 0.88 0.89 0.89 2810
7 0.50 0.48 0.49 538
8 0.82 0.81 0.81 1695
9 0.73 0.71 0.72 1059
avg / total 0.71 0.71 0.71 12376
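instead of hand-picking the weights we could also let sklearn derive them from the class frequencies; a sketch using compute_class_weight, where the 'balanced' heuristic weighs each class inversely to its frequency:
In [ ]:
from sklearn.utils.class_weight import compute_class_weight
#'balanced' assigns each class the weight n_samples / (n_classes * class_count)
classes = np.unique(y_train)
print dict(zip(classes, np.round(compute_class_weight('balanced',classes,y_train),2)))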
let's see if a support vector machine will do any better
In [23]:
from sklearn.svm import SVC
svc = SVC(kernel='linear',C=0.1,max_iter=1000,random_state=12345)
svc.fit(X_train,y_train)
svc_pred = svc.predict(X_val)
print confusion_matrix(y_pred=svc_pred,y_true=y_val)
sns.heatmap(confusion_matrix(y_pred=svc_pred,y_true=y_val),cmap='Greens',xticklabels=range(1,10),
yticklabels=range(1,10))
print 'classification report results:\r\n' + classification_report(y_pred=svc_pred,y_true=y_val)
/usr/local/lib/python2.7/dist-packages/sklearn/svm/base.py:220: ConvergenceWarning: Solver terminated early (max_iter=1000). Consider pre-processing your data with StandardScaler or MinMaxScaler.
% self.max_iter, ConvergenceWarning)
[[ 46 2 1 6 2 3 53 79 218]
[ 5 661 530 1417 32 0 516 34 27]
[ 1 292 437 514 7 0 272 18 9]
[ 3 79 28 349 9 1 78 1 9]
[ 0 1 6 3 523 0 1 0 1]
[ 326 35 37 181 0 788 947 241 255]
[ 20 42 33 25 1 18 353 30 16]
[ 97 4 14 3 2 100 232 923 320]
[ 190 1 1 13 2 17 11 33 791]]
classification report results:
precision recall f1-score support
1 0.07 0.11 0.08 410
2 0.59 0.21 0.30 3222
3 0.40 0.28 0.33 1550
4 0.14 0.63 0.23 557
5 0.90 0.98 0.94 535
6 0.85 0.28 0.42 2810
7 0.14 0.66 0.24 538
8 0.68 0.54 0.60 1695
9 0.48 0.75 0.58 1059
avg / total 0.59 0.39 0.41 12376
we can see that we get less accurate results on the whole,
but we achieved better results for the class we selected to be more important
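the ConvergenceWarning above also suggests scaling the features before fitting the SVM; a minimal sketch of a scaled variant via a Pipeline (its results would differ from the run above):
In [ ]:
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
#scale to zero mean / unit variance before the linear SVM so the solver converges faster
svc_scaled = Pipeline([('scale',StandardScaler()),
                       ('svc',SVC(kernel='linear',C=0.1,max_iter=1000,random_state=12345))])
svc_scaled.fit(X_train,y_train)
print classification_report(y_pred=svc_scaled.predict(X_val),y_true=y_val)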
let's try ensemble learning - we'll start with random forest
In [24]:
from sklearn.ensemble import RandomForestClassifier
rfc = RandomForestClassifier(n_jobs=4,min_impurity_split=1e-05,n_estimators=100)
rfc.fit(X_train,y_train)
rfc_pred = rfc.predict(X_val)
print confusion_matrix(y_pred=rfc_pred,y_true=y_val)
sns.heatmap(confusion_matrix(y_pred=rfc_pred,y_true=y_val),cmap='Greens',xticklabels=range(1,10),yticklabels=range(1,10))
print 'classification report results:\r\n' + classification_report(y_pred=rfc_pred,y_true=y_val)
[[ 169 16 1 0 0 35 8 85 96]
[ 0 2846 335 10 4 6 9 5 7]
[ 0 733 780 14 0 2 13 7 1]
[ 1 211 77 232 3 20 10 3 0]
[ 0 10 1 0 521 1 0 1 1]
[ 6 27 2 2 1 2698 22 35 17]
[ 4 71 41 9 1 44 319 42 7]
[ 12 12 4 0 2 50 6 1587 22]
[ 19 24 2 0 2 29 4 38 941]]
classification report results:
precision recall f1-score support
1 0.80 0.41 0.54 410
2 0.72 0.88 0.79 3222
3 0.63 0.50 0.56 1550
4 0.87 0.42 0.56 557
5 0.98 0.97 0.97 535
6 0.94 0.96 0.95 2810
7 0.82 0.59 0.69 538
8 0.88 0.94 0.91 1695
9 0.86 0.89 0.87 1059
avg / total 0.82 0.82 0.81 12376
yes!
the random forest model got the highest score so far with no special effort - just applying fit and predict
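a nice by-product of the fitted forest is a ranking of the features; a quick sketch using its feature_importances_ attribute:
In [ ]:
#top 10 features according to the random forest
print pd.Series(rfc.feature_importances_,index=X_train.columns).sort_values(ascending=False).head(10)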
let's check if gradient boosting can further improve on that
In [25]:
from sklearn.ensemble import GradientBoostingClassifier
gbc = GradientBoostingClassifier(min_impurity_split=1e-05,n_estimators=100,max_depth=7)
gbc.fit(X_train,y_train)
gbc_pred = gbc.predict(X_val)
print confusion_matrix(y_pred=gbc_pred,y_true=y_val)
sns.heatmap(confusion_matrix(y_pred=gbc_pred,y_true=y_val),cmap='Greens',xticklabels=range(1,10),yticklabels=range(1,10))
print 'classification report results:\r\n' + classification_report(y_pred=gbc_pred,y_true=y_val)
[[ 214 18 2 0 1 30 11 50 84]
[ 1 2724 419 33 5 5 19 14 2]
[ 0 620 839 37 0 3 43 5 3]
[ 0 162 83 274 4 16 13 3 2]
[ 1 5 4 0 524 1 0 0 0]
[ 12 17 6 8 0 2687 32 30 18]
[ 7 53 44 4 1 32 375 22 0]
[ 24 9 5 1 3 45 17 1566 25]
[ 35 20 1 1 1 29 3 37 932]]
classification report results:
precision recall f1-score support
1 0.73 0.52 0.61 410
2 0.75 0.85 0.80 3222
3 0.60 0.54 0.57 1550
4 0.77 0.49 0.60 557
5 0.97 0.98 0.98 535
6 0.94 0.96 0.95 2810
7 0.73 0.70 0.71 538
8 0.91 0.92 0.92 1695
9 0.87 0.88 0.88 1059
avg / total 0.82 0.82 0.81 12376
wow! we got an average precision and recall of 0.82 (F1 0.81) - this looks great!
let's predict on the test set using the gradient boosting model and create a submission for the kaggle platform
In [26]:
test_pred = gbc.predict_proba(te_data.iloc[:,1:])
In [27]:
subm = pd.DataFrame(test_pred)
subm.columns = ['class_'+ str(x) for x in range(1,10)]
subm.index = te_data.id
subm.to_csv('../subm/gradient_boosting_classifier_submission.csv')
In [28]:
#let's make sure our prediction fits the desired format:
print subm.head()
print 'submission shape: {}'.format(subm.shape)
print ''
print "great! we're good to go on and submit our results"
class_1 class_2 class_3 class_4 class_5 class_6 class_7 \
id
1 0.000588 0.102009 0.200167 0.684847 0.000030 0.001411 0.007599
2 0.003356 0.090358 0.009055 0.004860 0.000256 0.724626 0.021677
3 0.000420 0.000457 0.000444 0.000575 0.000006 0.995546 0.000392
4 0.001570 0.710096 0.249005 0.028145 0.000029 0.001276 0.002030
5 0.025541 0.004113 0.002842 0.000935 0.000610 0.012384 0.003886
class_8 class_9
id
1 0.002586 0.000762
2 0.142966 0.002846
3 0.001804 0.000357
4 0.001777 0.006072
5 0.162684 0.787004
submission shape: (144368, 9)
great! we're good to go on and submit our results
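one more sanity check before uploading: the competition is scored with multiclass log loss, which we can estimate on our validation split - a sketch reusing the fitted gbc:
In [ ]:
from sklearn.metrics import log_loss
#estimate the competition metric (multiclass log loss) on the held-out validation set
print 'validation log loss: {:.4f}'.format(log_loss(y_val, gbc.predict_proba(X_val)))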